Disasters & Data: What You Need To Know

Data DR needs to be considered beyond simple infrastructure business continuity planning

If you’re doing business in this era, then you’re dealing with a lot of data. That data is everywhere as well, especially if parts of your business are global. And depending on where you’re doing business, the data your business generates has to adhere to different rules, especially where disaster recovery and business continuity are concerned.

Multiple Approaches, Multiple Concerns
There are two ways to look at big data: it either means you have a lot of data, or it means that you’re doing a lot of data processing. Often, you’re doing both.

Often, data is considered secondary (and thus not important): it’s generated as a byproduct of a primary application, and the company uses it to better understand how that application is working. In cases like this, protecting the data is generally not a top priority; all these businesses want is a service credit when the infrastructure supporting the data goes down.

Meanwhile, others use data processing as part of their mission-critical, revenue-generating work, so the data becomes a goal in its own right. Advertising, natural language processing, finance, and healthcare businesses all approach data this way, as do other companies focused primarily on data analytics or data gathering.

Geo-diversity Challenges
More often than not, the real solution for ensuring that data is safe is a storage system that replicates it geographically at the time of creation. With big data, we are typically talking about petabytes, and moving that much data after the fact is still a major hassle.
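
As an illustration of replication at creation time, here is a minimal sketch assuming AWS S3 object storage and the boto3 client; the bucket names and IAM role ARN are hypothetical placeholders. Once the rule is in place, every object is copied to the second region as it is written, with no separate backup job to run.

    import boto3

    s3 = boto3.client("s3")

    # Versioning must already be enabled on both buckets for replication to work.
    # With this rule active, objects written to the source bucket are copied to
    # the second region at (or shortly after) creation time.
    s3.put_bucket_replication(
        Bucket="analytics-data-us-east-1",  # hypothetical source bucket
        ReplicationConfiguration={
            "Role": "arn:aws:iam::123456789012:role/replication-role",  # hypothetical
            "Rules": [{
                "ID": "replicate-all",
                "Prefix": "",  # empty prefix matches every object
                "Status": "Enabled",
                "Destination": {"Bucket": "arn:aws:s3:::analytics-data-eu-west-1"},
            }],
        },
    )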

When you’re dealing with that level of volume, you want storage systems that are meshed, with asynchronous consistency between regions. You want to avoid ever needing to restore 10 petabytes of data from backup tapes; it’s a nearly impossible task. Remember, the data isn’t backed up per se; rather, it’s replicated within the storage system you’re using.
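
A quick back-of-the-envelope calculation shows why. Even over a dedicated 10 Gbps link running at full line rate (an optimistic assumption that ignores tape read speed entirely), restoring 10 petabytes takes about three months:

    # Rough transfer-time math for a 10 PB restore (decimal units assumed).
    data_bits = 10 * 10**15 * 8           # 10 petabytes expressed in bits
    link_bps = 10 * 10**9                 # a dedicated 10 Gbps link at line rate

    seconds = data_bits / link_bps        # 8,000,000 seconds
    print(f"{seconds / 86400:.0f} days")  # roughly 93 days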

Coupling that data with processing power is another issue. If you’re processing that much data, you have a server farm doing it, and that farm needs to be proximal to the data to be effective. Whether you’re using Hadoop, Oracle, or SQL warehouse tools, something needs to be done with the data. If you have proper geographic replication of the data, a provider-based solution is typically preferable: compute close to the data in each facility, but you pay for only one at a time.
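
A minimal sketch of that pattern, with all names hypothetical: keep a full replica in each region, but stand up compute only in the first region whose replica passes a health check, failing over when it doesn’t.

    REGIONS = ["us-east-1", "eu-west-1"]  # hypothetical regions holding replicas

    def storage_healthy(region: str) -> bool:
        """Stand-in for a real probe of the region's storage endpoint."""
        return True  # replace with an actual health check

    def pick_processing_region() -> str:
        """Run compute next to the first healthy replica; fail over otherwise."""
        for region in REGIONS:  # listed in order of preference
            if storage_healthy(region):
                return region
        raise RuntimeError("no region has a healthy replica")

    print(pick_processing_region())  # "us-east-1" during normal operation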

The Difficulty With Data
Something else to consider is that big data is fairly immutable: the data set grows, but the originating data is unchanging. If there is a failure, a company has to reprocess from the end of the last completed batch. Maybe that represents a loss of 12 or six hours, or perhaps it’s just 10 minutes. Those numbers have to be defined to the point where they make sense in terms of impact to business continuity, since such determinations affect how you do batch data processing.
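
In practice that means checkpointing batch boundaries, since the checkpoint interval is effectively your recovery point objective: everything newer than the last checkpoint must be reprocessed after a failure. A minimal sketch, with the checkpoint file location a hypothetical placeholder:

    import json
    import time

    CHECKPOINT_FILE = "last_batch.json"  # hypothetical checkpoint location

    def save_checkpoint(batch_end: float) -> None:
        # Record the timestamp of the last fully processed batch.
        with open(CHECKPOINT_FILE, "w") as f:
            json.dump({"batch_end": batch_end}, f)

    def recovery_point() -> float:
        # After a failure, everything newer than this must be reprocessed.
        try:
            with open(CHECKPOINT_FILE) as f:
                return json.load(f)["batch_end"]
        except FileNotFoundError:
            return 0.0  # no checkpoint yet: replay from the beginning

    save_checkpoint(time.time())
    print("reprocess records newer than", time.ctime(recovery_point()))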

How does your business handle big data disaster recovery in the cloud? Let us know @CloudGathering.

By Jake Gardner

