Amazon Delivers Cloud Archive Storage with Glacier

Glacier enables AWS customers to store their long-term retention data within Amazon’s existing data centres at a very low cost

At the end of August 2012, Amazon Web Services released their latest service offering – a long-term archive service called Glacier. As a complement to S3, their existing service for actively accessed data, Glacier provides long-term storage for “cold” data – information that has to be retained for a long time but doesn’t require frequent access.

What Exactly is Glacier?

Many organisations need to retain data in archive format for extended periods of time. This may be for regulatory or compliance purposes, or simply part of normal business processes. Good examples are medical, healthcare, financial or media (video and audio) data. For many IT departments, backup has provided a lazy way of archiving information: access to backups retained for up to 10 years makes a cheap and rudimentary archive service. However, backup isn’t archive (see my recent article on the subject), as an archive provides additional features around data management and security. Glacier enables AWS customers to store their long-term retention data within Amazon’s existing data centres at a very low cost, starting at $0.01/GB per month. The low cost is tempered by rather leisurely access times of three to five hours for data retrieval.

Within Glacier, data is stored in vaults. Up to 1,000 vaults may be created per AWS region, with each vault carrying its own security credentials via Amazon’s IAM (Identity and Access Management) service. Within a vault, data is stored as archives, each consisting of one or more files; files that need to be kept together for consistency can be bundled into a single archive. A vault can hold an unlimited number of archives, with a limit of 40TB on any single archive.
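As a minimal sketch of that vault/archive structure, the snippet below creates a vault and uploads one archive. It assumes the modern boto3 SDK (which post-dates this article); the vault and file names are purely illustrative.

    import boto3

    glacier = boto3.client('glacier')  # credentials and region come from the environment

    # One of up to 1,000 vaults per region; access is governed by IAM policies.
    glacier.create_vault(vaultName='compliance-vault')

    # An archive (up to 40TB) is the unit of storage; bundle related files into
    # a single tarball first if they must stay consistent with each other.
    with open('q3-2012-records.tar', 'rb') as f:
        resp = glacier.upload_archive(
            vaultName='compliance-vault',
            archiveDescription='Q3 2012 records',
            body=f,
        )
    print(resp['archiveId'])  # the only handle Glacier returns for the archive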

Data uploaded to Glacier is stored using AES-256 encryption, managed by AWS.  Customers requiring their own encryption are advised to pre-encrypt their data before upload.
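For customers who do want to hold their own keys, pre-encryption can be as simple as the sketch below. Fernet (AES-128-CBC plus HMAC, from the Python cryptography package) is just one illustrative choice, not anything AWS prescribes; the file name is again invented.

    from cryptography.fernet import Fernet

    # Encrypt locally before upload so Amazon never sees the plaintext.
    key = Fernet.generate_key()  # keep this safe: lose the key, lose the data
    with open('q3-2012-records.tar', 'rb') as f:
        ciphertext = Fernet(key).encrypt(f.read())
    # 'ciphertext' is what would then be passed to upload_archive as the body.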

Amazon are claiming a data “durability” level of 99.999999999% per archive, although I’m not really sure how they define the term “durability” and exactly what that means in terms of data loss.

As mentioned earlier, data retrieval takes three to five hours per archive. Retrieval requests (or jobs, as they are known in Glacier) are queued asynchronously, with completion notifications available via AWS SNS (Simple Notification Service). Once retrieved, the data is available to the customer for 24 hours. The long retrieval time implies that the majority of Glacier data is stored on tape, with retrieval resulting in a copy to disk for general access. Based on the costs, this also makes sense.
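A retrieval might look like the sketch below, again assuming the modern boto3 SDK; the vault name, archive ID placeholder and SNS topic ARN are invented for illustration.

    import boto3

    glacier = boto3.client('glacier')
    archive_id = '...'  # the 138-character ID returned by upload_archive

    # Queue an asynchronous retrieval job; Glacier posts to the SNS topic
    # once the archive has been staged (typically 3-5 hours later).
    job = glacier.initiate_job(
        vaultName='compliance-vault',
        jobParameters={
            'Type': 'archive-retrieval',
            'ArchiveId': archive_id,
            'SNSTopic': 'arn:aws:sns:eu-west-1:123456789012:glacier-jobs',
        },
    )

    # After the notification arrives, the staged copy is readable for 24 hours.
    output = glacier.get_job_output(vaultName='compliance-vault',
                                    jobId=job['jobId'])
    data = output['body'].read()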

The Charging Model

Charging for Glacier is more complex than for the other AWS offerings and includes the following components:

  • Storage: $0.01/GB/month
  • Data upload: no charge for data volume
  • Upload and retrieval requests: $0.05 per 1,000 requests
  • Archive query commands (list vault contents, get job status, delete objects): no charge
  • Data retrieval: the first 5% of the archive per month is free, from $0.011/GB after that
  • Data out (moving data outside an AWS region): $0.12 down to $0.05/GB, depending on volume
  • Moving data to EC2: no charge
  • Deletion of data less than 90 days old: $0.033/GB

It’s interesting that there is a charge for deleting new data, presumably to encourage users to use the service only for its intended purpose. In addition, only 5% of the archive can be retrieved each month without a retrieval charge (and moving that data outside an AWS region still incurs data-out fees), whereas transfers to EC2 are free. This creates an ecosystem that encourages data to be kept in Glacier, using EC2 as the indexing, search or refresh mechanism.
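To see how these components combine, here is a rough monthly estimate using the rates quoted above; the storage volume and retrieval figures are invented for illustration, and the real retrieval tiering was more complex than the flat entry rate used here.

    # Illustrative monthly Glacier bill, using the prices listed above.
    stored_gb = 10_000                            # 10TB kept in Glacier
    storage_cost = stored_gb * 0.01               # $100.00/month for storage

    free_retrieval_gb = stored_gb * 0.05          # 5% of the archive is free: 500GB
    retrieved_gb = 800                            # hypothetical restore this month
    billable_gb = max(0, retrieved_gb - free_retrieval_gb)
    retrieval_cost = billable_gb * 0.011          # 300GB x $0.011 = $3.30

    data_out_cost = retrieved_gb * 0.12           # worst case if moved out of the region
    ec2_transfer_cost = 0.0                       # transfers to EC2 are free

    print(f"storage ${storage_cost:.2f}, retrieval ${retrieval_cost:.2f}, "
          f"data out up to ${data_out_cost:.2f}")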

What’s Not Included

Glacier itself is simply a large storage vault for data. Each archive is identified by a system-generated 138-character ID rather than a user-chosen key. Data access is managed via REST-based APIs, with pre-built Java and .NET SDKs also available. This means there are no facilities within Glacier for some of the most fundamental parts of an archive – notably metadata and indexing capabilities. These need to be developed by users themselves, and as yet I haven’t found anyone offering services that use Glacier as their storage platform. There are a few little quirks to bear in mind too. For instance, vaults are inventoried only about once a day, so the inventory can lag behind any external index the user creates.
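As a concrete illustration of that gap, the sketch below keeps a local JSON file mapping file names to the opaque archive IDs Glacier returns – the kind of external index users must build for themselves. It again assumes the modern boto3 SDK, and the vault and file names are invented.

    import json
    import boto3

    glacier = boto3.client('glacier')
    INDEX_FILE = 'glacier_index.json'

    def upload_and_index(vault, path):
        """Upload a file and record its archive ID, since Glacier keeps no metadata."""
        with open(path, 'rb') as f:
            resp = glacier.upload_archive(
                vaultName=vault, archiveDescription=path, body=f)
        try:
            with open(INDEX_FILE) as fh:
                index = json.load(fh)
        except FileNotFoundError:
            index = {}
        index[path] = resp['archiveId']  # the opaque 138-character ID
        with open(INDEX_FILE, 'w') as fh:
            json.dump(index, fh, indent=2)
        return resp['archiveId']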

The Architect’s View

Amazon have provided a framework and storage repository that could be used by many organisations to store their data over the long term. This does not mean that tape is dead – far from it – Glacier itself is almost certainly built on tape technology. What Amazon are providing is a data store against which third-party developers can create their own archive solutions, in a similar way to S3 (think Nasuni or Jungledisk). There are already many other cloud archiving solutions available today (see the same recent article), and on its own Glacier doesn’t represent direct competition; rather, it provides another storage platform in which data can be stored. However, there are a few things to consider when using a Glacier-based service:

  • Indexing depends entirely on a third-party vendor’s indexing system, or must be managed by the end user
  • Taking data out of the archive to move elsewhere will incur a cost
  • Refreshing data within the archive will incur a cost

Glacier and the supporting services could therefore represent a significant and unexpected lock-in for customers.

Overall, Glacier does provide a framework against which developers can create new archive services, and that’s a good thing. Cost will be a significant factor for many, and the headline price of $0.01/GB/month certainly sounds attractive. Like the other AWS offerings, I’m sure Glacier will be very successful.

 
