@CloudExpo: Article

The Changing Face of Disaster Recovery in the Age of Cloud Computing

Adding robustness to an increasing number of organizations and helping create a sustainable, scalable business environment

Disaster Recovery (DR) has traditionally been reserved for applications deemed mission critical, because organizations didn't want to incur the expense of DR for less important applications. Today, thanks to cloud computing, many organizations are considering DR for every application the business deems essential.

DR in the cloud is still a relatively new concept and, like many technology trends we've seen, it attracts plenty of hype and misinterpretation. Multiple schools of thought exist on whether or not to implement DR in the cloud.

Today, cloud-based DR has shaken up traditional legacy approaches and presents a persuasive alternative. Some organizations see the cloud as part of their DR plan; for others, it has become the de facto DR plan. DR in the cloud is readily available and affordable for SMBs, but there is enough confusion out there to give companies a false sense of security and lure them into bad decisions.

Most companies are still running pilots that move their less critical applications to the cloud, so disaster recovery in the cloud remains at an early adoption stage. It is SMBs, however, that are discovering the benefits of adopting the cloud for disaster recovery, since it provides an excellent alternative for companies that don't want to, or can't afford to, spend on disaster recovery for their secondary infrastructure. Beyond cost reductions, DR in the cloud can also reduce data center footprint and the demands on IT staff and infrastructure.

Many smaller companies assume that using free or low-cost services to back up their data amounts to a sufficient DR plan. In fact, most organizations - both small and large - miss the fact that supporting a mobile workforce requires a robust disaster recovery plan that goes well beyond dumping data on cloud storage services.

DR in the cloud is not without its share of risks. Security is the main concern for organizations evaluating cloud-based DR. Beyond security, consider whether you have sufficient bandwidth, time and expertise to redirect users to the cloud in the event of a true disaster.
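The bandwidth question can be sanity-checked with simple arithmetic long before a disaster strikes. A minimal sketch follows; the 2 TB dataset, 100 Mbps link and 70% link-efficiency figures are illustrative assumptions, not recommendations:

```python
# Back-of-the-envelope check: how long would a full restore from the
# cloud take over your internet link?
def transfer_hours(data_gb: float, bandwidth_mbps: float, efficiency: float = 0.7) -> float:
    """Hours needed to move data_gb over a bandwidth_mbps link,
    assuming only `efficiency` of the nominal bandwidth is usable."""
    bits_to_move = data_gb * 8 * 1_000_000_000        # decimal GB -> bits
    usable_bits_per_sec = bandwidth_mbps * 1_000_000 * efficiency
    return bits_to_move / usable_bits_per_sec / 3600

if __name__ == "__main__":
    # Illustrative figures only: 2 TB of data over a 100 Mbps line.
    print(f"{transfer_hours(2000, 100):.1f} hours")   # roughly 2.6 days
```

If the answer exceeds your recovery time objective, the plan needs seeding, compression, or a restore-in-the-cloud strategy rather than a pull back to on-premise infrastructure.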

Cloud computing can definitely help SMBs address the critical factors behind a nimble DR strategy. Not having to invest in specialized, and at times large, infrastructure saves money and, perhaps even more importantly, time. Cloud-based DR also provides greater flexibility than traditional DR, because the underlying computing model offers the scale and elasticity to manage the unpredictable impact of a disaster.

Enterprises should weigh three compelling factors when considering a DR-in-the-cloud strategy. First are the CAPEX savings. Second is the agility that cloud-based DR solutions provide - by agility we mean time-to-implement, demand-scale factor, time-to-respond, qualitative metrics, and how quickly you can get started. Third is the inherent DR architecture capability a cloud DR provider offers, such as options for services, locations, storage, and service levels. Cloud platforms built around virtualization technologies dramatically speed up recovery time compared to building physical servers from scratch.

Here are five cloud-based DR strategies to consider:

  1. Cloud as a backup infrastructure site: You can choose multiple sites from a single cloud vendor or spread across multiple vendors. In case of disaster, restore happens from the cloud to on-premise infrastructure. Nowadays backup application vendors are extending their backup suites with options to back up directly to popular cloud service providers such as Amazon Web Services, Windows Azure, Rackspace, etc. Note that the recovery step can derail your DR plan if cloud access mechanisms and options are not well understood and carefully planned for.
  2. Backup as well as restore to the cloud: Here your data is not restored to on-premise infrastructure but to the cloud itself. This requires both compute and storage in the cloud, such as Amazon EC2, EBS and S3. With this approach, restore is either done on demand or on an ongoing basis to keep the backup fairly up to date and highly responsive to the business. Your management architecture should take care of business continuity and help meet your recovery time objective.
  3. Replicating applications and data to cloud resources: Replicating to cloud virtual machines provides DR for cloud applications and data, and it can cover on-premise systems as well. Replication products come in handy here; however, the cost of maintaining multiple setups and tools at all times can strain the budget.
  4. Leveraging traditional managed providers for cloud-hosted DR: These are players that provide managed cloud application services and also offer managed DR as a service, with SLAs around high availability, recovery point objective (RPO), recovery time objective (RTO), etc. However, their cost and operational mechanisms may be prohibitive, since many still use traditional technologies and methods to deliver managed DR.
  5. Using purpose-built DR-as-a-Service (DRaaS) solutions: Services like rCloud offer an affordable, complete cloud-based disaster recovery platform and remove the DR planning and management hassle, letting organizations focus on business growth rather than worry about business continuity.
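To make the first strategy concrete, the core of a backup-to-cloud workflow is packaging data into a checksummed archive, shipping it to the provider, and verifying integrity after a restore. A minimal standard-library sketch follows; the file names are illustrative assumptions, and the upload step itself is provider-specific (e.g. an S3 or Azure SDK call), so it is only indicated in a comment:

```python
# Sketch of strategy 1 (cloud as a backup site): package a directory
# into a checksummed archive ready for upload, and verify the checksum
# after it is pulled back down during a restore.
import hashlib
import tarfile
import tempfile
from pathlib import Path

def make_backup(source_dir: str, dest_dir: str) -> tuple[Path, str]:
    """Create a gzipped tar of source_dir; return (archive path, sha256)."""
    archive = Path(dest_dir) / (Path(source_dir).name + ".tar.gz")
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(source_dir, arcname=Path(source_dir).name)
    digest = hashlib.sha256(archive.read_bytes()).hexdigest()
    # Record the digest next to the archive so a restore can verify it.
    Path(str(archive) + ".sha256").write_text(digest)
    return archive, digest

def verify_backup(archive: Path) -> bool:
    """Re-hash a downloaded archive and compare against the recorded digest."""
    recorded = Path(str(archive) + ".sha256").read_text().strip()
    return hashlib.sha256(archive.read_bytes()).hexdigest() == recorded

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as src, tempfile.TemporaryDirectory() as dst:
        (Path(src) / "db_dump.sql").write_text("-- sample data --")
        archive, digest = make_backup(src, dst)
        # Here the archive would be shipped to the provider, e.g. with
        # boto3's s3.upload_file() or an equivalent SDK call.
        print("backup verified:", verify_backup(archive))
```

The verification step matters precisely because, as noted above, recovery is where an ill-understood cloud access path derails a DR plan: an archive that uploaded cleanly but cannot be validated on the way back is not a backup.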

There are multiple strategies enterprises can use to deploy DR with cloud computing, and just as many technology solutions, methodologies, and practices. What enterprises need today is the right technology advice and best-practice guidance to ensure business continuity through the cloud.

To get started, test your DR plans in the cloud using non-sensitive data first. Make this a line item in your cloud strategy roadmap and evaluate your options. Forward-thinking enterprises are embracing the changing face of business continuity and reaping the benefits of reduced effort and cost by leveraging cloud-based DR. Cloud-based DR is already delivering the robustness an increasing number of organizations need, and paving the way for a sustainable, scalable business environment. Are you one of them?

More Stories By Jiten Patil

Jiten Patil is Principal Technology Consultant & Cloud Expert, CTO Office, at Persistent Systems Limited, a global leader in software product development and services. He has 15 years of industry experience and has spent the past six years working with cloud service providers, ISVs and enterprises on SaaS, IaaS, PaaS and hybrid cloud computing solutions. His key expertise is in guiding organizations on cloud strategy and roadmap; architecting solutions for public and private application services, platform services and multi-tenancy methodologies; application enablement and migration; devising new cloud solutions, tools and IP products; and conducting competitive assessments across cloud technologies. He can be reached at [email protected] / Twitter @jiten_patil
