Secure Cloud Backup and Data Transfer: How It’s Done

For those interested in how an enterprise-grade secure cloud backup solution handles data encryption, here is some background

Author: Nick Mueller, Zetta.net

The Internet is a dangerous place. There's a new story on Ars Technica about a corporate or government hack almost every day. So for those interested in how an enterprise-grade secure cloud backup solution handles data encryption, here is some background on a few of the security protocols used to protect customer data here at Zetta:

Secure Sockets Layer (SSL) / Transport Layer Security (TLS)
SSL is an Internet security protocol developed by Netscape in the mid-1990s that is incorporated into browsers and web servers. The protocol uses the RSA public-key/private-key encryption system and digital certificates to establish a secure connection between the client and server; once the handshake completes, the two sides share a symmetric session key that encrypts everything transmitted over that connection.
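
For a concrete picture of what that handshake buys you, here is a minimal sketch using Python's standard library ssl module (example.com is a placeholder host, not a Zetta endpoint):

```python
# Minimal sketch: open a verified SSL/TLS connection to a server.
import socket
import ssl

context = ssl.create_default_context()  # loads the system's trusted CA certificates

with socket.create_connection(("example.com", 443)) as sock:
    # wrap_socket performs the handshake: the server presents its digital
    # certificate, it is verified, and a symmetric session key is negotiated.
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        print(tls.version())                 # negotiated protocol, e.g. "TLSv1.3"
        print(tls.getpeercert()["subject"])  # the server certificate's subject
```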

SSL has since evolved into the TLS protocol, but both protocols remain in use. When you see a URL that starts with https instead of http, the site requires an SSL/TLS connection. Both protocols are IETF standards. While the SSL Working Group is no longer active, the TLS Working Group is, and it has issued a number of documents on the protocol since the start of the year. SSL/TLS differs from a complementary IETF protocol, Secure HTTP (S-HTTP), in that S-HTTP is designed for sending single messages, while SSL/TLS creates a secure connection over which any amount of data can be sent.
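
That difference is easy to see in code: one SSL/TLS connection can carry as many requests as you like. A small sketch, again with a placeholder host and paths:

```python
# Sketch: several requests sent over a single persistent SSL/TLS connection.
import http.client

conn = http.client.HTTPSConnection("example.com")
for path in ("/", "/status", "/data"):  # placeholder paths
    conn.request("GET", path)
    resp = conn.getresponse()
    resp.read()                         # drain the body so the connection can be reused
    print(path, resp.status)
conn.close()
```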

WebDAV
Web-based Distributed Authoring and Versioning (WebDAV) is a protocol originally developed by Jim Whitehead, a Computer Science professor at the University of California, Santa Cruz, when he was working with the World Wide Web Consortium. It later became an IETF standard. WebDAV turns the Web into a readable and writable medium. As the W3C defines it, "WebDAV is an HTTP-based protocol that allows accessing the contents of an HTTP server as a networked file system; to put it simply, it allows [you] to edit directly a Web site as if it were an additional disk on your computer." Linux, Mac, and Windows clients all support WebDAV. Further information on the protocol is available on the IETF website, the W3C website, and the WebDAV site maintained by Whitehead.
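
Because WebDAV is plain HTTP with a few extra methods, any HTTP client can speak it. Here is an illustrative sketch of listing a directory with WebDAV's PROPFIND method; the host, path, and credentials are placeholders, not a real endpoint:

```python
# Sketch: list a WebDAV directory with a raw PROPFIND request.
import base64
import http.client

conn = http.client.HTTPSConnection("dav.example.com")
auth = base64.b64encode(b"user:password").decode()  # placeholder credentials

conn.request("PROPFIND", "/backups/", headers={
    "Depth": "1",                      # immediate children only, not the whole tree
    "Authorization": f"Basic {auth}",
})
resp = conn.getresponse()
print(resp.status)                     # 207 Multi-Status on success
print(resp.read().decode()[:500])      # XML describing each file and directory
```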

WebDAV is critical to online backup because:

  1. It allows a file to be locked so that others can't make changes while one person is working on it (see the LOCK sketch after this list).
  2. When you back up files to our servers, you can mount those files directly on your own computer and work on them as if they were local files.
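
The locking in point 1 uses WebDAV's LOCK method (RFC 4918). A minimal sketch of acquiring an exclusive write lock, again with placeholder host, path, and credentials:

```python
# Sketch: acquire an exclusive write lock on a file via WebDAV's LOCK method.
import base64
import http.client

LOCK_BODY = b"""<?xml version="1.0" encoding="utf-8"?>
<D:lockinfo xmlns:D="DAV:">
  <D:lockscope><D:exclusive/></D:lockscope>
  <D:locktype><D:write/></D:locktype>
  <D:owner>mailto:user@example.com</D:owner>
</D:lockinfo>"""

conn = http.client.HTTPSConnection("dav.example.com")
auth = base64.b64encode(b"user:password").decode()  # placeholder credentials

conn.request("LOCK", "/backups/report.xlsx", body=LOCK_BODY, headers={
    "Timeout": "Second-600",           # request a 10-minute lock
    "Authorization": f"Basic {auth}",
})
resp = conn.getresponse()
print(resp.status)                     # 200 OK when the lock is granted
print(resp.getheader("Lock-Token"))    # present this in an If header when writing
```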

"WebDAV is latency independent and efficient over wide area networks, particularly when compared to file protocols like CIFS and NFS," said James E. Bagley, senior analyst, and Deni Connor, founding analyst of Storage Strategies NOW, in a report released last summer. "By using WebDAV, the data is encrypted during transmission and stored quickly and efficiently. Connections are kept open continuously, which reduces the amount of stress on the customer network and the Internet in general. This is coupled with the speed and efficiency of Zetta.net's compression and incremental-forever update technology, making it unique in terms of throughput and recovery availability."

Zetta's enterprise-grade secure cloud backup and disaster recovery solution uses WebDAV, 256-bit SSL, and Salsa20 encryption to keep your data secure. Learn more by trying Zetta free in your environment.

Nick is Zetta's Chief Content Officer, and has been working with writing and social media teams to create digital content since the days when the BBS reigned.
