
Big Data Journal: Article

Simplified Data Retention on a Massive Scale Speeds Access to Big Data

Organizations can gain competitive advantages when they can rely on data retention for improved decision making and trend analysis

There are numerous applications for cost-effective data retention. Organizations can gain substantial competitive advantages when they can rely on data retention for improved decision making and trend analysis. Research enterprises can make use of large-scale data sets, enabling them to study information more completely than ever before.

Simplified data retention on a massive scale speeds up access to Big Data: data sets too large to analyze and manage using ordinary methods. This data, in both structured and unstructured form, is valuable and comes from sources such as trading systems.

In many cases existing systems cannot process data of this variety and volume. Some organizations store such data in file systems so as not to overburden their databases, but because Big Data is increasing at an exponential rate, this stopgap will not suffice in the long run. Machine-generated data is likely to exceed the processing capability of conventional systems, and the cost of extracting value from it can be so high that many organizations simply shy away from it.

Today technology is just beginning to address Big Data issues. Many organizations try to apply existing strategies to manage this data effectively. Standard methods from relational database queries to complex analysis tools are being used. Data retention software is also being applied to extract relevant information from Big Data sources.

Currently, Big Data retention technology is available that is scalable and easy to implement. Using this technology, it's possible to access Big Data online using SQL along with business intelligence software. Such a system combines storage platforms running specialized software with a massive-scale data repository developed for online data retention. This Big Data management system is scalable and designed to process machine-generated data at compression ratios of roughly 40:1 while maintaining online availability.
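As a rough illustration of what a compression ratio on the order of 40:1 means for storage footprint, consider the arithmetic below. The raw-data figure is a hypothetical example, not a number from the article.

```python
# Hypothetical illustration of a ~40:1 compression ratio.
# The 2 TB raw-data figure is assumed for the sake of the example.
raw_gb = 2000.0             # raw machine-generated data, in GB (assumed)
ratio = 40.0                # compression ratio (raw size : stored size)
stored_gb = raw_gb / ratio  # archive footprint after compression

print(stored_gb)  # 50.0 -> 2 TB of raw data needs only 50 GB on disk
```

At this ratio, storage demand drops by 97.5%, which is where the infrastructure-reduction benefit discussed below comes from.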

Organizations that need to process Big Data may benefit from databases specifically designed for this purpose. Such databases are cost-effective and are currently in use at numerous organizations internationally. They work in parallel, allowing tens of billions of records to be processed each day, while retention capacity is practically limitless. These databases can run on content addressable storage (CAS), direct attach storage (DAS), or a storage area network (SAN). Benefits of this data storage and retrieval system include reduced infrastructure, through lower physical storage demand, and effective, configurable record management.

One Big Data retention solution has three components. The first is paired server level service managers that share metadata and provide import and query capability. The second is a data archive residing on a cluster services node as well as storage nodes. It's designed with enough scalability to process billions of objects. The third component consists of shared storage that can be local direct access storage, a network file system or a comprehensive clustered file system.

This type of system was recently tested on 508 GB of artificially generated stock trading test data, modeled after NASDAQ. Performance tests showed an import rate of close to 12 billion records per hour. Compression reduced the data by 476.1 GB, leaving an archive only about 6.3% of the original size. A SQL query was executed selecting the three largest-volume stocks, each with well over 4 million trades per day; run against 11.6 billion records, it took approximately 5.5 seconds to execute.
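The benchmark query described above can be sketched in miniature with Python's built-in sqlite3 module. The table layout, symbols, and volumes are all assumptions for illustration; the article does not specify a schema, and the real system runs against billions of rows rather than a handful.

```python
import sqlite3

# In-memory stand-in for the trade archive (schema and data are assumed).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (symbol TEXT, volume INTEGER)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?)",
    [("AAPL", 5000000), ("MSFT", 4500000), ("INTC", 4200000),
     ("ORCL", 3000000), ("CSCO", 2500000)],
)

# Three largest-volume stocks with daily trade volume over 4 million,
# analogous to the benchmark query described above.
rows = conn.execute(
    "SELECT symbol, volume FROM trades "
    "WHERE volume > 4000000 "
    "ORDER BY volume DESC LIMIT 3"
).fetchall()

print(rows)  # [('AAPL', 5000000), ('MSFT', 4500000), ('INTC', 4200000)]
```

The point of the benchmark is that the same plain SQL issued here runs unchanged against the compressed archive, with business intelligence tools connecting in the usual way.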

Big Data is high-volume, high-velocity, and often highly variable as well. Big Data retention solutions can lead to better decision making, new discoveries, and even process optimization. Science is a major area that can benefit; meteorology is just one field that can reap rewards from new advances in massive-scale data retention. The ability to do research and analysis on extremely large data sets gives greater understanding to those modeling weather, oceanographic conditions, the economy, or social trends. With cost-effective technology now available, many more organizations will consider the possibilities of Big Data retention in their enterprise.

More Stories By Alan McMahon

Alan McMahon has worked for Dell for the past 13 years and is involved in enterprise solution design across a range of products, from servers and storage to virtualization. He now focuses on marketing for Dell. He is based in Ireland and enjoys sailing as a pastime.

