
Cloud Computing's Datacenters: How Safe is Safe?

Recent Catastrophic Events Bring the Subject of Location into Focus

A question came up recently at a Cloud Computing conference in Manila: given the Philippines' history of natural disasters, how confident can investors really be about building mission-critical datacenters in the country?

One member of a special panel noted that there were no datacenter or call-center outages in the Philippines during the Ondoy and Pepeng typhoon disasters in late 2009.

Another noted, "If your datacenter goes down here, then you have a lot bigger problems than your datacenter going down."

This comment was, unfortunately, prescient given the ongoing disaster in northern Japan. The catastrophic failure of nuclear power plants in the Ring of Fire is indeed a bigger problem, and it also sharpens the question of whether datacenters should be built on such dangerous ground.

Global Danger Zones
Without getting pedantic, we must nevertheless ask, "How safe is safe, and how dangerous is dangerous?" Certainly California is also in the Ring of Fire, albeit not subject to the subduction quakes that create large, local tsunamis.

Seattle, on the other hand, is in a subduction zone similar to those of Japan, the Philippines, and Indonesia. And peaceful New Zealand's recent catastrophic quake proved that you don't need an 8.0+ subduction shake to have a disaster.

Recent flooding in New South Wales, Australia would most certainly have devastated any datacenters in its path.

The long, hard winter in the American Northeast and Midwest has shown how even the hardiest of cities can become paralyzed by the elements. And anyone from the American Midwest and South can tell you about the potential of tornadoes to terrorize and obliterate.

Where should we put all these datacenters, then? France?

How Important a Factor?
Despite the scary images and grim prognosis from Japan, the nuclear plant "events" have so far been gauged as a 4 on the International Nuclear Event Scale (INES), a seven-level logarithmic scale for rating such incidents. The Chernobyl blast in 1986 is the only level-7 incident so far; Three Mile Island in 1979 was a 5 on this scale.
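Because each step up the INES scale represents roughly a tenfold increase in severity, the gap between a 4 and a 7 is far larger than the numbers suggest. As a rough back-of-the-envelope comparison (treating the scale as strictly logarithmic, which it only approximates):

\[
\frac{\text{severity at level 7}}{\text{severity at level 4}} \approx 10^{\,7-4} = 1000
\]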

And in the Philippines, a TV poll from March 14 found that 69% of respondents were still in favor of completing a long-abandoned nuclear-power project that was started during the Marcos administration.

The Philippines has very expensive power, even as it uses only about 3% of the electricity per capita of North America, Western Europe, or Japan. The high cost of electricity has been a dealbreaker for many large industrial proposals; without cheaper power, the country will continue to lose out to Malaysia, Indonesia, and Vietnam.

The country also suffers from a constitutional limit of 40% foreign ownership for most projects.

So basic financial fundamentals will probably trump plate tectonics and the weather when it comes to building the oodles of new datacenters that global Cloud Computing will demand over the next several decades.

Latency and Privacy
There are also the issues of latency and privacy.

Latency comes into play in financial markets in particular, where performance delays are measured in microseconds. For many companies and industries, "real time" means accounting for the speed of light and measuring datacenter proximity in terms of feet, not miles, let alone thousands of miles.
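For a sense of scale, here is a minimal Python sketch of raw propagation delay over optical fiber. The distances, labels, and the roughly two-thirds-of-c velocity factor are illustrative assumptions, and real deployments add serialization, switching, and routing delay on top of this physical floor.

```python
# Back-of-the-envelope fiber propagation delay.
# Why latency-sensitive shops measure datacenter proximity in feet:
# light in fiber covers only about 200 meters per microsecond.

C_VACUUM_M_PER_S = 299_792_458    # speed of light in vacuum (m/s)
FIBER_VELOCITY_FACTOR = 0.67      # typical signal speed in fiber, ~2/3 c (assumption)

def one_way_delay_us(distance_m: float) -> float:
    """One-way propagation delay over fiber, in microseconds."""
    return distance_m / (C_VACUUM_M_PER_S * FIBER_VELOCITY_FACTOR) * 1e6

for distance_m, label in [(100, "across the street"),
                          (50_000, "across a metro area"),
                          (4_000_000, "across an ocean")]:
    print(f"{label:>20}: {one_way_delay_us(distance_m):12,.2f} us one-way")
```

At roughly 5 microseconds per kilometer each way, a firm chasing microseconds really is counting feet, and a datacenter an ocean away sits tens of milliseconds out of reach no matter how good its network gear is.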

Privacy in particular and data integrity in general are big, politically driven issues that can require datacenters to be located in the country where the data originates.

What about your organization? How important is location with respect to (a) potential natural disasters, (b) latency, and (c) data integrity?

Go to www.rogerstrukhoff.sys-con.com or www.twitter.com/strukhoff and shoot me an email or a tweet.


More Stories By Roger Strukhoff

Roger Strukhoff (@IoT2040) is Executive Director of the Tau Institute for Global ICT Research, with offices in Illinois and Manila. He is Conference Chair of @CloudExpo & @ThingsExpo, and Editor of SYS-CON Media's CloudComputing BigData & IoT Journals. He holds a BA from Knox College & conducted MBA studies at CSU-East Bay.
