Can the Cloud Do ‘In Perpetuity’?

One thing, of course, that most public cloud providers are good at is offering a platform upon which others can build

Cloud computing is great, right? As a way to get something up and running quickly, affordably, and with a minimum of fuss, it can rarely be beaten.

But some of the most compelling attributes of the public cloud are best suited to ephemeral or (relatively!) short-term use cases. You can spin up a cloud server in minutes. You can scale a cloud-based application to cope with the peaks and troughs of demand. You can control all of this through a web console, with no more than a credit card and a laptop. Silicon Valley, SoMa, Silicon Alley, Silicon Roundabout, Silicon Allee, Silicon Wadi, Silicon Forest, Silicon Welly, and the Silicon Bog (only one of those was made up, I think) are full to bursting with bright young things building exciting new products (and silly photo sharing sites) powered only by the cloud and expensive coffee.

And then you have government, private, and commercial Archives, with an overriding imperative to keep stuff for a very, very long time. These Archives clearly can (and do) use cloud computing in the same ways as everyone else. They use clouds to cost-effectively transform data from one format to another, they use clouds to stream large and popular media files to the public, and they use clouds in all sorts of other ways to make innumerable workflows and processes easier, cheaper, or more robust. For those use cases, even the biggest, grandest, and most important of archives is actually pretty much like any other user. Cloud’s as useful to them as it is to the rest of us, and that’s great.

Does it make sense, though, for Archives to entrust any of their long-term preservation role to the cloud? I’m not sure (yet), but The National Archives (TNA) here in the UK wants to find out. They’ve commissioned a study from a small consultancy, Charles Beagrie, and I’m subcontracted to provide a bit of cloud knowledge to the team.

Out of the box, you’d have to question the sense of an archive entrusting anything to the public cloud for purposes of long-term preservation. That’s not really what Amazon’s Simple Storage Service or Rackspace’s Cloud Files or any of the other cloud-based filestores are for. Their Service Level Agreements and their technical underpinnings are all about cost-effectively storing lots of stuff and losing as little as possible. If a file is lost or damaged, the service provider might pay out a few service credits, and/or the customer might restore from a backup, and everyone continues on their way. Archivists, we were reminded at one of the project’s focus groups, have this peculiar expectation that the systems they use to preserve their primary materials won’t lose anything at all. A couple of service credits don’t really help when you just lost, truncated, or changed a few words in the digital equivalent of the Magna Carta or the Domesday Book or the Book of Kells or the Declaration of Arbroath. And, just to be totally clear, losing a digital copy of the Declaration of Arbroath would be ok. The National Archives of Scotland still has the vellum (I presume their copy was written on vellum?) in a climate-controlled vault. They probably also have a CD or two of backups for the digital images. Things become a bit more serious when the content is ‘born digital,’ and the file you’re preserving is the thing itself and not just an image of some physical artefact.

Even with archival-ish services like Glacier, which Amazon says

is designed to provide average annual durability of 99.999999999% for an archive. The service redundantly stores data in multiple facilities and on multiple devices within each facility. To increase durability, Amazon Glacier synchronously stores your data across multiple facilities before returning SUCCESS on uploading archives. Unlike traditional systems that can require laborious data verification and manual repair, Glacier performs regular, systematic data integrity checks and is built to be automatically self-healing,

(my emphasis)

the big public cloud providers aren’t really in the business of supporting the extreme needs of an Archive. Archives demand a whole extra level of error checking, resilience, redundancy and integrity, and it would be cost-prohibitive for AWS and their competitors to do all that across their sprawling data centres when most customers are actually perfectly happy with “redundantly stores data in multiple facilities” and “automatically self-healing.”
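To put eleven nines in perspective, a quick back-of-the-envelope calculation (my own illustrative sketch, not Amazon's published methodology; the ten-million-object archive is an invented figure) shows what that durability claim implies, and why it still falls short of an archivist's 'lose nothing, ever':

```python
# Back-of-the-envelope: expected annual object loss at a stated durability.
# The object count is an illustrative assumption, not a real archive's inventory.
annual_durability = 0.99999999999      # Glacier's stated "eleven nines"
loss_probability = 1 - annual_durability

objects = 10_000_000                   # hypothetical archive of ten million files
expected_losses_per_year = objects * loss_probability

print(f"P(loss per object per year): {loss_probability:.0e}")
print(f"Expected losses per year across {objects:,} objects: "
      f"{expected_losses_per_year:.4f}")
# Roughly one expected loss every ~10,000 years at this scale. Impressive --
# but "expected" is a statistical average, not a guarantee, which is exactly
# the archivist's objection.
```

The arithmetic flatters the cloud provider right up until the one loss happens to be the irreplaceable file.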

Interestingly, Seagate sees value in offering a Glacier competitor capable of storing data “intact for decades” and offering access instantly, rather than in a matter of hours as Glacier does. As the service is based in Utah, I doubt that European government archives would touch it, but it will be interesting to see whether their North American cousins show any interest…

One thing, of course, that most public cloud providers are good at is offering a platform upon which others can build. Archivists, like others, have begun to layer rules, policies, procedures and processes on top of the bare-bones cloud infrastructure offerings, to build something a little more robust and dependable. Services like DuraCloud take AWS and Rackspace (currently only in their US data centres, but that could change), and add things like proactive error checking and even more backups to deliver something that an archivist might be prepared to trust.
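That “proactive error checking” typically means periodic fixity checks: recomputing each file's checksum and comparing it against a stored manifest, so silent corruption is caught and repaired from another copy rather than discovered decades too late. A minimal sketch of the idea (my own illustration, not DuraCloud's actual implementation; the manifest format and function names are invented):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream a file through SHA-256 without loading it all into memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def fixity_check(manifest: dict[str, str], root: Path) -> list[str]:
    """Compare current checksums against a manifest of expected values.

    Returns the relative paths whose checksum no longer matches --
    candidates for restoration from a redundant copy.
    """
    failures = []
    for rel_path, expected in manifest.items():
        if sha256_of(root / rel_path) != expected:
            failures.append(rel_path)
    return failures
```

Run on a schedule against every copy of the collection, this is the kind of layered policy that turns commodity cloud storage into something an archivist might begin to trust.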

There’s a use case here, and there are plenty of (mostly university) archives in the States putting DuraCloud and similar cloud-powered tools to work as part of their preservation strategy.

But I can’t help wondering whether some great big enterprise data management solution, with multiply redundant disks, multiply redundant backups, and a whole heap of watertight, ironclad, fault-tolerant, and ridiculously over-specified policies might be a better (albeit eye-wateringly expensive) way to preserve the truly irreplaceable. Either that, or archives and archivists need to explicitly embrace a more pragmatic approach to what they’re attempting with these systems.

‘Design for failure’ is a core tenet of cloud-powered systems. What’s the archival equivalent? ‘Lose nothing, ever’ just won’t cut it.

Disclaimer: Charles Beagrie is a client. TNA is a client of theirs. This post is not part of the project. Any opinions expressed here are my own, a work in progress… and subject to change!

Image of The National Archives by Flickr user ‘electropod’

More Stories By Paul Miller

Paul Miller works at the interface between the worlds of Cloud Computing and the Semantic Web, providing the insights that enable you to exploit the next wave as we approach the World Wide Database.

He blogs at www.cloudofdata.com.
