
The Importance of Disaster Recovery Testing

Don't lose sight of the basics of a DR plan

In this business, we talk quite a lot about the importance of disaster recovery plans and strategies. Today's organizations face a multitude of choices when it comes to building a disaster recovery (DR) plan to protect their data and provide business continuity in the case of outages and other disasters. Available options include building out a second data center/disaster site, populating a colocation facility with redundant hardware, outsourcing recovery to a hosting provider, or utilizing cloud compute and storage for an on-demand recovery strategy. And yet, despite all those options and all that discussion, a recent IDG research survey found that 42% of companies still lacked a disaster recovery solution as of early 2013.

In the past, I have talked about the many benefits of an on-demand recovery strategy using cloud-integrated storage solutions. Regardless of the solution, key to the DR plan is the ability to satisfy the recovery time and recovery point objectives (RTO & RPO) of your business - in a way that makes economic sense. At that point, however, many organizations stop. They build their plan. They even execute their plan so that the pieces are in place. But a recovery plan and good intentions are not enough. A wiser approach includes regularly validating your solution via disaster testing.
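To make the RTO/RPO check concrete, here is a minimal sketch (not any vendor's tool; the function name, timestamps, and thresholds are illustrative assumptions) of how a recovery event can be scored against those two objectives:

```python
from datetime import datetime, timedelta

def meets_objectives(last_replica_time, outage_start, recovery_done, rpo, rto):
    """Score one recovery event against RPO/RTO targets.

    Data loss (RPO test)  = outage_start - last_replica_time
    Downtime  (RTO test)  = recovery_done - outage_start
    """
    data_loss = outage_start - last_replica_time
    downtime = recovery_done - outage_start
    return data_loss <= rpo and downtime <= rto

# Hypothetical drill: last snapshot 45 minutes before the outage,
# recovery finished 3 hours after it, against a 1-hour RPO / 4-hour RTO.
outage = datetime(2013, 6, 1, 12, 0)
ok = meets_objectives(
    last_replica_time=outage - timedelta(minutes=45),
    outage_start=outage,
    recovery_done=outage + timedelta(hours=3),
    rpo=timedelta(hours=1),
    rto=timedelta(hours=4),
)
print(ok)  # True: 45 min of data loss <= 1 h RPO, 3 h of downtime <= 4 h RTO
```

A regular DR test is essentially gathering real measurements for these two inputs rather than assuming them.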

A recent survey by Kroll Ontrack showed that 67% of organizations leveraging cloud/virtualization do not test their disaster recovery plan regularly.

To that end, disaster recovery testing is essential to understanding whether recovery objectives will be met if a disaster strikes. Far better to exercise your DR plan before the inevitable disaster, like the Hurricane Sandy outages that affected the US east coast in 2012, forces the issue. A true disaster is an inopportune time to discover the recovery details you overlooked during planning.

Why don't more organizations make DR and DR testing a priority? Well, there are a couple of key reasons.

First, disaster recovery strategies have traditionally been expensive, requiring redundant infrastructure that is rarely used outside of a DR test or true disaster. Many organizations opt not to put a DR plan in place because it is cost-prohibitive, and there are no regulatory requirements forcing them to do so. Second, DR infrastructure is expensive to test and can be disruptive to normal business operations, requiring an outage in order to bring up data and applications in a disaster recovery environment - and outages are what businesses want to avoid in the first place.

The good news is that cloud and technology improvements have made both on-demand DR and DR testing viable for nearly any organization. Cloud-integrated storage solutions enable on-demand DR, without dedicated redundant infrastructure or costs when the infrastructure is not in use. These solutions make DR planning and testing viable for organizations that simply could not afford a DR plan. Even better are solutions with the ability to access snapshots of data online, without bringing down or pausing primary operations, offering an affordable DR strategy that can also be validated regularly without any disruption or outage.
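A non-disruptive DR test of the kind described above boils down to restoring a snapshot copy into an isolated environment, timing the recovery, and running validation checks, all without touching production. A minimal sketch, assuming caller-supplied restore and check callables (the function and field names here are hypothetical, not a real product API):

```python
import time

def run_dr_drill(restore_snapshot, checks, rto_seconds):
    """Restore a snapshot copy into an isolated test environment,
    time the recovery, and validate it - production stays untouched."""
    start = time.monotonic()
    env = restore_snapshot()  # bring up data/apps from a snapshot copy
    elapsed = time.monotonic() - start
    results = {name: check(env) for name, check in checks.items()}
    return {
        "recovery_seconds": elapsed,
        "within_rto": elapsed <= rto_seconds,
        "checks_passed": all(results.values()),
        "details": results,
    }

# Example with stubbed-in steps standing in for real restore/validation logic.
report = run_dr_drill(
    restore_snapshot=lambda: {"db": "up"},
    checks={"db_reachable": lambda env: env.get("db") == "up"},
    rto_seconds=4 * 3600,  # 4-hour RTO
)
print(report["within_rto"], report["checks_passed"])
```

Because the drill runs against a snapshot copy rather than primary storage, it can be scheduled as routinely as any other maintenance job.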

Cloud may not solve every recovery headache and may not be the perfect solution for every circumstance. However, organizations lacking a DR plan or a viable way of testing their DR plan should seriously consider the cloud alternatives available today.

More Stories By Nicos Vekiarides

Nicos Vekiarides is the Chief Executive Officer & Co-Founder of TwinStrata. He has spent over 20 years in enterprise data storage, both as a business manager and as an entrepreneur and founder in startup companies.

Prior to TwinStrata, he served as VP of Product Strategy and Technology at Incipient, Inc., where he helped deliver the industry's first storage virtualization solution embedded in a switch. Prior to Incipient, he was General Manager of the storage virtualization business at Hewlett-Packard. Vekiarides came to HP with the acquisition of StorageApps where he was the founding VP of Engineering. At StorageApps, he built a team that brought to market the industry's first storage virtualization appliance. Prior to StorageApps, he spent a number of years in the data storage industry working at Sun Microsystems and Encore Computer. At Encore, he architected and delivered Encore Computer's SP data replication products that were a key factor in the acquisition of Encore's storage division by Sun Microsystems.
