Quantum's New Scalar i6000 HD Stores 5 PB in a Single Rack and Scales to Over 75 PB

Delivers Industry's Best Tape Slot Density for Large-Scale Archiving and Long-Term Retention of Big Data

SAN JOSE, CA -- (Marketwire) -- 01/30/13 -- Quantum Corp. (NYSE: QTM), a proven global expert in data protection and big data management, today announced the new Scalar i6000 HD enterprise tape library, providing the industry's highest slot density. Designed to address customers' big data and archive needs with slot densities that are twice those offered by competitors, this new library makes nearly 5 PB of data available in a single 19" rack and scales to more than 75 PB of capacity. Quantum has also added new high-availability and management features to the Scalar i6000, as well as greater access capabilities for storing big data archives on tape.
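
As a rough, back-of-the-envelope illustration (the release does not state which media generation or compression ratio these figures assume), the stated capacities are consistent with LTO-6 cartridges at 6.25 TB compressed. The short Python sketch below works that arithmetic out; the cartridge capacity is an assumption, not a figure from Quantum.

```python
# Illustrative arithmetic only; the cartridge figure assumes LTO-6 media
# at 2.5:1 compression, which the release does not explicitly state.
TB_PER_CARTRIDGE_COMPRESSED = 6.25   # assumed LTO-6 compressed capacity, in TB
TB_PER_PB = 1000.0                   # decimal units

slots_per_rack = (5 * TB_PER_PB) / TB_PER_CARTRIDGE_COMPRESSED      # ~800 slots
slots_full_scale = (75 * TB_PER_PB) / TB_PER_CARTRIDGE_COMPRESSED   # ~12,000 slots

print(f"~{slots_per_rack:.0f} slots per rack, ~{slots_full_scale:.0f} slots at full scale")
```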

In addition to best-in-class slot density and scalability, the Scalar i6000 HD delivers high performance and availability, with new active-active dual robotics for fast data access times. Scheduled for availability in March and backward compatible with Quantum's Scalar i2000 systems, the Scalar i6000 HD offers a comprehensive suite of capabilities, including proactive diagnostics and redundancy across key systems such as power, network ports for encryption key management, and connectivity for multi-fabric SAN architectures. Leveraging its embedded iLayer™ software, the Scalar i6000 HD also includes Quantum's unique Active Vault technology for storing vaulted tapes securely within the library, along with automated policy-based integrity checking, delivering the richest feature set for managing massive data archives.
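
Conceptually, policy-based integrity checking means the library verifies vaulted media on a schedule rather than waiting for a restore to fail. The sketch below is a hypothetical Python illustration of that pattern only; it is not the iLayer API, and the `verify_cartridge` callable and 90-day interval are assumptions for the example.

```python
from datetime import datetime, timedelta

CHECK_INTERVAL = timedelta(days=90)  # hypothetical policy: re-verify each vaulted tape quarterly

def run_integrity_policy(vaulted_cartridges, verify_cartridge):
    """Verify any vaulted cartridge whose last check is older than the policy interval.

    `vaulted_cartridges` is a list of dicts like
    {"barcode": "ABC123L6", "last_verified": datetime(...)};
    `verify_cartridge` is a callable that performs the media scan and returns True on success.
    """
    now = datetime.utcnow()
    results = {}
    for cart in vaulted_cartridges:
        if now - cart["last_verified"] >= CHECK_INTERVAL:
            ok = verify_cartridge(cart["barcode"])  # read-verify pass on the tape
            cart["last_verified"] = now
            results[cart["barcode"]] = ok
    return results
```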

Tape's New Role in Managing Big Data
As the capture and monetization of data grows, so does the role of tape in big data environments. Built to be reliable and inexpensive, tape technology now serves as an effective tier for storing large-scale archives behind disk storage and next-generation object storage systems. To manage data between storage tiers intelligently, Quantum's StorNext AEL6000 Archive uses integrated, policy-based software to migrate files automatically. The StorNext AEL6000 Archive will incorporate the new high-density capabilities next quarter, enabling StorNext® customers to benefit from greater tape slot density, a smaller footprint, and increased capacity as well.
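
To make "policy-based migration" concrete, the sketch below shows the general pattern: files on a primary disk tier that have gone cold beyond an age threshold are moved to an archive tier. This is a generic Python illustration, not StorNext's actual policy engine, and the mount paths and 30-day threshold are hypothetical.

```python
import shutil
import time
from pathlib import Path

DISK_TIER = Path("/stornext/primary")     # hypothetical primary (disk) tier mount
ARCHIVE_TIER = Path("/stornext/archive")  # hypothetical archive (tape-backed) tier mount
AGE_THRESHOLD = 30 * 24 * 3600            # migrate files not accessed in 30 days

def migrate_cold_files():
    """Move files older than the threshold from the disk tier to the archive tier."""
    cutoff = time.time() - AGE_THRESHOLD
    for path in DISK_TIER.rglob("*"):
        if path.is_file() and path.stat().st_atime < cutoff:
            dest = ARCHIVE_TIER / path.relative_to(DISK_TIER)
            dest.parent.mkdir(parents=True, exist_ok=True)
            shutil.move(str(path), str(dest))  # a real tiering system does this transparently

if __name__ == "__main__":
    migrate_cold_files()
```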

In addition, Quantum's Scalar LTFS provides Scalar i6000 users with greater flexibility for accessing big data archives on tape. Available now, Scalar LTFS provides an easy-to-use NAS file system presentation for tape users in industries working with big data, such as media and entertainment and life sciences. For example, Biola University has deployed Scalar LTFS in its workflow as a flexible and cost-effective approach to storing and accessing large video archives.
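
Because LTFS presents tape as an ordinary file system, applications can read and write archived content with standard file operations once the share is mounted. The minimal Python sketch below assumes a hypothetical mount point and directory layout; it uses no Quantum-specific API, just normal file I/O against the mounted path.

```python
import shutil
from pathlib import Path

LTFS_MOUNT = Path("/mnt/scalar_ltfs")  # hypothetical mount point for the LTFS presentation

def archive_video(source_file: str) -> Path:
    """Copy a finished video file into the tape-backed archive via the LTFS mount."""
    dest = LTFS_MOUNT / "video_archive" / Path(source_file).name
    dest.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(source_file, dest)  # ordinary file copy; the library handles the tape I/O
    return dest

def list_archive() -> list:
    """List archived assets exactly as on any NAS share."""
    return sorted(str(p) for p in (LTFS_MOUNT / "video_archive").glob("*"))
```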

Supporting Quotes
Alex Rodriguez, VP of System Engineering and Product Development, Expedient
"The Scalar platform enables us to easily license additional slots and not have to worry about significant fixed asset costs. This flexibility plus the reliable performance of the Scalar i6000 makes it a truly attractive tape library for us. Additionally, Quantum's introduction of high density capabilities addresses the challenges of managing data growth in a service provider environment where efficient use of floor space is critical."

Robert Amatruda, research director, Data Protection and Recovery, IDC
"Today's announcement demonstrates Quantum's continued focus on extending its tape leadership in the market. The latest density, scalability and redundancy features added to the Scalar i6000 tape libraries make it an ideal solution for managing big data archives and cost effectively maintaining data stored in the cloud."

Robert Clark, senior vice president, Data Protection, Quantum
"85 percent of the Fortune 100 have turned to Quantum to meet their storage needs, and we're continuing to deliver innovative new solutions to address the challenges that large enterprise customers face. This includes adding new offerings such as the Scalar i6000 HD to our tape automation portfolio, which not only is unmatched in its breadth and depth but also reflects the unique benefits tape technology provides in today's evolving data protection environment."

About Quantum
Quantum is a proven global expert in data protection and big data management, providing specialized storage solutions for physical, virtual and cloud environments. From small businesses to major enterprises, more than 100,000 customers have trusted Quantum to help maximize the value of their data by protecting and preserving it over its entire lifecycle. With Quantum, customers can Be Certain™ they're able to adapt in a changing world -- keeping more data longer, bridging from today to tomorrow, and reducing costs. See how at www.quantum.com/BeCertain.

Quantum, the Quantum logo, Be Certain, iLayer, Scalar and StorNext are either registered trademarks or trademarks of Quantum Corporation and its affiliates in the United States and/or other countries. All other trademarks are the property of their respective owners.

"Safe Harbor" Statement: This press release contains "forward-looking" statements. All statements other than statements of historical fact are statements that could be deemed forward-looking statements. Specifically, but without limitation, statements relating to 1) customer benefits and value to customers from using Quantum's Scalar i6000 libraries (including the Scalar i6000 HD libraries), Scalar LTFS and StorNext AEL6000, 2) the future availability of the Scalar i6000 HD libraries and 3) customer demand for and Quantum's future revenue from such libraries and appliances are forward-looking statements within the meaning of the Safe Harbor. All forward-looking statements in this press release are based on information available to Quantum on the date hereof. These statements involve known and unknown risks, uncertainties and other factors that may cause Quantum's actual results to differ materially from those implied by the forward-looking statements. These risks include operational difficulties, unforeseen technical limitations, unexpected changes in market conditions and unanticipated changes in customers' needs or requirements, as well as the risks set forth in Quantum's periodic filings with the Securities and Exchange Commission, including, but not limited to, those risks and uncertainties listed in the section entitled "Risk Factors," in Quantum's Quarterly Report on Form 10-Q filed with the Securities and Exchange Commission on November 9, 2012 and Quantum's Annual Report on Form 10-K filed with the Securities and Exchange Commission on June 14, 2012, especially those risks listed in this section under the heading "Our operating results depend on a limited number of products and on new product introductions, which may not be successful, in which case our business, financial condition and operating results may be materially and adversely affected." Quantum expressly disclaims any obligation to update or alter its forward-looking statements, whether as a result of new information, future events or otherwise.
