
Can the Cloud Do ‘In Perpetuity’?

One thing, of course, that most public cloud providers are good at is offering a platform upon which others can build

Cloud computing is great, right? As a way to get something up and running quickly, affordably, and with a minimum of fuss, it can rarely be beaten.

But some of the most compelling attributes of the public cloud are best suited to ephemeral or (relatively!) short-term use cases. You can spin up a cloud server in minutes. You can scale a cloud-based application to cope with the peaks and troughs of demand. You can control all of this through a web console, with no more than a credit card and a laptop. Silicon Valley, SoMa, Silicon Alley, Silicon Roundabout, Silicon Allee, Silicon Wadi, Silicon Forest, Silicon Welly, and the Silicon Bog (only one of those was made up, I think) are full to bursting with bright young things building exciting new products (and silly photo sharing sites) powered only by the cloud and expensive coffee.

And then you have government, private, and commercial Archives, with an overriding imperative to keep stuff for a very, very long time. These Archives clearly can (and do) use cloud computing in the same ways as everyone else. They use clouds to cost-effectively transform data from one format to another, they use clouds to stream large and popular media files to the public, and they use clouds in all sorts of other ways to make innumerable workflows and processes easier, cheaper, or more robust. For those use cases, even the biggest, grandest, and most important of archives is actually pretty much like any other user. Cloud’s as useful to them as it is to the rest of us, and that’s great.

Does it make sense, though, for Archives to entrust any of their long-term preservation role to the cloud? I’m not sure (yet), but The National Archives (TNA) here in the UK wants to find out. They’ve commissioned a study from a small consultancy, Charles Beagrie, and I’m subcontracted to provide a bit of cloud knowledge to the team.

Out of the box, you’d have to question the sense of an archive entrusting anything to the public cloud for purposes of long-term preservation. That’s not really what Amazon’s Simple Storage Service or Rackspace’s Cloud Files or any of the other cloud-based filestores are for. Their Service Level Agreements and their technical underpinnings are all about cost-effectively storing lots of stuff and losing as little as possible. If a file is lost or damaged, the service provider might pay out a few service credits, and/or the customer might restore from a backup, and everyone continues on their way.

Archivists, we were reminded at one of the project’s focus groups, have this peculiar expectation that the systems they use to preserve their primary materials won’t lose anything at all. A couple of service credits don’t really help when you’ve just lost, truncated, or changed a few words in the digital equivalent of the Magna Carta or the Domesday Book or the Book of Kells or the Declaration of Arbroath. And, just to be totally clear, losing a digital copy of the Declaration of Arbroath would be OK. The National Archives of Scotland still has the vellum (I presume their copy was written on vellum?) in a climate-controlled vault. They probably also have a CD or two of backups for the digital images. Things become a bit more serious when the content is ‘born digital,’ and the file you’re preserving is the thing itself and not just an image of some physical artefact.
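
To make the gap concrete: what archivists call ‘fixity checking’ is something the customer has to layer on top of a generic object store. A minimal sketch in Python, assuming boto3 and illustrative bucket, key, and file names, and relying on the fact that for simple single-part, unencrypted uploads S3’s ETag is typically the hex MD5 of the object:

```python
# A minimal fixity check: confirm that the object S3 holds still matches the
# file we archived. Assumes a single-part, unencrypted upload, where S3's
# ETag is typically the hex MD5 of the object body. Bucket, key, and file
# names are illustrative.
import hashlib

import boto3


def local_md5(path, chunk_size=8 * 1024 * 1024):
    """Stream the file so large archival objects don't exhaust memory."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


s3 = boto3.client("s3")
head = s3.head_object(Bucket="example-archive", Key="domesday-scan.tiff")
stored_etag = head["ETag"].strip('"')

if stored_etag != local_md5("domesday-scan.tiff"):
    # An archivist escalates here; a typical cloud customer might never notice.
    raise RuntimeError("Fixity check failed: stored object differs from source")
```

Even that modest check is the customer’s job, not the provider’s; the SLA only promises credits after the fact.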

Even with archival-ish services like Glacier, which Amazon says

“is designed to provide average annual durability of 99.999999999% for an archive. The service redundantly stores data in multiple facilities and on multiple devices within each facility. To increase durability, Amazon Glacier synchronously stores your data across multiple facilities before returning SUCCESS on uploading archives. Unlike traditional systems that can require laborious data verification and manual repair, Glacier performs regular, systematic data integrity checks and is built to be automatically self-healing”

(my emphasis),

the big public cloud providers aren’t really in the business of supporting the extreme needs of an Archive. Archives demand a whole extra level of error checking, resilience, redundancy, and integrity, and it would be cost-prohibitive for AWS and their competitors to do all that across their sprawling data centres when most customers are actually perfectly happy with “redundantly stores data in multiple facilities” and “automatically self-healing.”
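
That said, the integrity machinery Glacier does expose is worth seeing. Its upload API accepts a SHA-256 ‘tree hash’ computed over 1 MiB chunks, which the service verifies before returning SUCCESS. A hedged sketch, with an illustrative vault and file name (boto3 will compute this checksum itself if you omit it):

```python
# A sketch of Glacier's end-to-end integrity check: the client sends a
# SHA-256 "tree hash" computed over 1 MiB chunks, and the service verifies
# the archive against it before acknowledging the upload. Vault and file
# names are illustrative.
import hashlib

import boto3

MiB = 1024 * 1024


def tree_hash(path):
    # Hash each 1 MiB chunk, then combine digests pairwise until one remains.
    hashes = []
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(MiB), b""):
            hashes.append(hashlib.sha256(chunk).digest())
    if not hashes:  # empty file: fall back to the hash of no data
        hashes = [hashlib.sha256(b"").digest()]
    while len(hashes) > 1:
        pairs = [hashes[i:i + 2] for i in range(0, len(hashes), 2)]
        hashes = [hashlib.sha256(b"".join(pair)).digest() for pair in pairs]
    return hashes[0].hex()


glacier = boto3.client("glacier")
with open("born-digital-record.warc", "rb") as body:
    response = glacier.upload_archive(
        vaultName="example-vault",
        checksum=tree_hash("born-digital-record.warc"),
        body=body,
    )
print(response["archiveId"])
```

Eleven nines of durability is still an average across the whole service, though, not a guarantee for any single irreplaceable object.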

Interestingly, Seagate sees value in offering a Glacier competitor capable of storing data “intact for decades” and providing access instantly, rather than in the hours Glacier takes. As it’s based in Utah, I doubt that European government archives would touch it, but it will be interesting to see whether their North American cousins show any interest…

One thing, of course, that most public cloud providers are good at is offering a platform upon which others can build. Archivists, like others, have begun to layer rules, policies, procedures, and processes on top of the bare-bones cloud infrastructure offerings to build something a little more robust and dependable. Services like DuraCloud take AWS and Rackspace (currently only in their US data centres, but that could change) and add things like proactive error checking and even more backups to deliver something that an archivist might be prepared to trust.
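
What ‘proactive error checking’ might look like in practice is a scheduled fixity sweep: re-read each object, recompute its checksum against a trusted manifest, and repair from an independent second copy on any mismatch. This is only a sketch of the general idea, with illustrative names, not DuraCloud’s actual implementation:

```python
# A sketch of the kind of proactive error checking a layered service can add:
# periodically re-read every object, recompute its checksum against a trusted
# manifest, and repair from an independent second copy on any mismatch. All
# names are illustrative.
import hashlib

import boto3

s3 = boto3.client("s3")


def fixity_sweep(bucket, backup_bucket, manifest):
    """manifest maps object key -> expected hex SHA-256 digest."""
    for key, expected in manifest.items():
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        if hashlib.sha256(body).hexdigest() != expected:
            # Bit rot or truncation detected: restore the known-good copy
            # rather than settling for a few service credits.
            s3.copy_object(
                Bucket=bucket,
                Key=key,
                CopySource={"Bucket": backup_bucket, "Key": key},
            )
```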

There’s a use case here, and there are plenty of (mostly university) archives in the States putting DuraCloud and similar cloud-powered tools to work as part of their preservation strategy.

But I can’t help wondering whether some great big enterprise data management solution, with multiply redundant disks, multiply redundant backups, and a whole heap of watertight, ironclad, fault-tolerant, and ridiculously over-specified policies, might be a better (albeit eye-wateringly expensive) way to preserve the truly irreplaceable. Either that, or archives and archivists need to explicitly embrace a more pragmatic approach to what they’re attempting with these systems.

‘Design for failure’ is a core tenet of cloud-powered systems. What’s the archival equivalent? ‘Lose nothing, ever’ just won’t cut it.

Disclaimer: Charles Beagrie is a client. TNA is a client of theirs. This post is not part of the project. Any opinions expressed here are my own, a work in progress… and subject to change!

Image of The National Archives by Flickr user ‘electropod’


More Stories By Paul Miller

Paul Miller works at the interface between the worlds of Cloud Computing and the Semantic Web, providing the insights that enable you to exploit the next wave as we approach the World Wide Database. He blogs at www.cloudofdata.com.
