Cloud Expo: Article

The Future of Big Data

The true value of Big Data lies in the useful information that can be extracted from it

Every five years or so, the technology industry blazes a new path of innovation. The PC, the Internet, smart mobility and social networking have emerged over the past 20-plus years, delivering new technologies and business ecosystems that have fundamentally changed the world. The latest catalyst is Big Data.

Nearly every major new computing era has had a hot IPO serve as the catalyst for widespread adoption of the shift. The recent Splunk IPO evokes parallels with Netscape, the company whose 1995 offering catalyzed a wave of Internet computing in both B2C and B2B marketplaces. It ushered in a surge of innovation and a plethora of new dot-com businesses. Hundreds of billions of dollars in new value were subsequently created, and business environments changed forever.

Big Data refers to the enormous volume, velocity, and variety of data that exists and has the potential to be turned into business value. The challenge of Big Data is taking inhuman amounts of data and distilling it into information that human brains can use. Most businesses accumulate astronomical amounts of data - and the volume is expanding at an alarming rate. According to IDC, the volume of digital content in the world will grow to 2.7 billion terabytes in 2012, up 48% from 2011, and will reach 8 billion terabytes by 2015. [1]

The data flood, of course, comes from both structured corporate databases and unstructured data from Web pages, blogs, social networking messages and other sources. For example, there are now countless digital sensors worldwide in industrial equipment, automobiles, electrical meters and shipping crates. They can measure and communicate location, movement, vibration, temperature, humidity, even chemical changes in the air. Companies wield data like a weapon. Retailers like Wal-Mart and Kohl's analyze sales, pricing and economic, demographic and weather data to tailor product selections at particular stores and determine the timing of price markdowns. Logistics companies like UPS mine data on truck delivery times and traffic patterns to fine-tune routing.

Today, a whole ecosystem of new businesses is springing up to engage with this new reality: companies that store data; companies that mine data for insight; and companies that aggregate data to make it manageable. But it's an ecosystem that's still emerging, and its exact shape has yet to make itself clear.

One of the biggest challenges of working with Big Data is assembling it and preparing it for analysis. Different systems store data in different formats, even within the same company. Assembling, standardizing, and cleaning data of irregularities - all without scrubbing it of the information that makes it valuable - is a central challenge of this space.
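As a minimal sketch of that cleaning step, consider two hypothetical systems that record the same transaction with different date formats, name casing, and amount encodings. The record fields and format strings here are illustrative assumptions, not from any particular product:

```python
from datetime import datetime

# Hypothetical records from two systems that store the same fields
# in different formats (date layout, name casing, amount as text).
system_a = {"customer": "ACME CORP", "date": "03/15/2012", "amount": "1,200.50"}
system_b = {"customer": "Acme Corp.", "date": "2012-03-15", "amount": 1200.5}

def normalize(record):
    """Coerce a record into one canonical schema without losing detail."""
    # Try each known date layout until one parses.
    date = None
    for fmt in ("%m/%d/%Y", "%Y-%m-%d"):
        try:
            date = datetime.strptime(record["date"], fmt).date()
            break
        except ValueError:
            continue
    # Amounts may arrive as formatted strings or as numbers.
    amount = record["amount"]
    if isinstance(amount, str):
        amount = float(amount.replace(",", ""))
    # Canonicalize the customer name: drop trailing dots, title-case it.
    customer = record["customer"].rstrip(".").title()
    return {"customer": customer, "date": date.isoformat(), "amount": amount}

print(normalize(system_a) == normalize(system_b))  # True
```

Both records collapse to the same canonical row, so downstream analysis sees one customer rather than two near-duplicates.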

Of course, Hadoop, an open source software framework derived from Google's MapReduce and Google File System (GFS) papers, is being leveraged by several technology vendors to do just that. Hadoop maps tasks across a cluster of machines, splitting them into smaller sub-tasks, before reducing the results into one master calculation. It's really an old grid computing technique given new life in the age of cloud computing.

Hadoop is converging with other technology advances such as high-speed data analysis made possible because of parallel computing, in-memory processing, and lower cost flash memory in the form of solid state drives. The prospect of being able to process troves of data very quickly, in memory, without time-consuming forays to retrieve information stored on disk drives, is a big advance that will enable companies to assemble, sort, and analyze data much more rapidly.

For example, T-Mobile is using SAP's HANA to mine data from stores, text messages and call centers on its 30 million U.S. customers to tailor personalized deals. What used to take a week can be done in three hours with the SAP system. Organizations that can leverage this capability to make faster and more informed business decisions will have a distinct advantage over competitors.

In a short period of time, Hadoop has transitioned from relative obscurity as a consumer Internet project into the mainstream consciousness of enterprise IT. Hadoop is designed to handle mountains of unstructured data. However, as it exists, the open source code is a long way from meeting enterprise requirements for security, management, and efficiency without some serious customization. Enterprise-scale Hadoop deployments require costly IT specialists who are capable of guiding a lot of somewhat disjointed processes. That currently limits adoption to organizations with substantial IT budgets.

It will take a refined platform to enable Hadoop and its derivatives to fit into the enterprise as a complement to existing data analytics and data warehousing tools from established business process vendors like Oracle, HP, and SAP. At Zettaset, for example, we are focused on making Hadoop much more accessible to enterprises of all sizes by creating a high availability platform that takes much of the complexity out of assembling and preparing huge amounts of data for analysis. We have aggregated multiple steps into a streamlined automated process, significantly enhanced security, and are now integrating our software into an appliance which can be racked in the data center and easily managed through a user-friendly GUI.

The true value of Big Data lies in the useful information that can be extracted from it. The future of Big Data is therefore to do for data and analytics what Moore's Law has done for computing hardware, and exponentially increase the speed and value of business intelligence. Whether it is linking geography and retail availability, using patient data to forecast public health trends, or analyzing global climate trends, we live in a world full of data. Effectively harnessing Big Data will give businesses a whole new lens through which to see it.

Reference

  1. "IDC Predictions 2012: Competing for 2020," IDC, December 2011.

More Stories By Jim Vogt

With more than 25 years of leadership experience in both start-up and established corporations, Jim Vogt brings a wealth of business and technology expertise to his role as president and CEO of Zettaset. Most recently, he served as senior vice president and general manager of the Cloud Services business unit at Blue Coat Systems. Prior to Blue Coat, he served as president and CEO at Trapeze Networks, which was acquired by Belden, Inc. He was also president and CEO at data encryption start-up Ingrian Networks (acquired in April, 2008 by SafeNet). Prior to his private company posts, Vogt spent 11 years with SynOptics, Bay and Nortel where he held several product line and general management roles, including president of Nortel’s Small Business Solutions group, vice president and general manager of Bay’s workgroup product and distributed network systems divisions, and vice president of product management for Bay’s desktop products group.

Jim holds a BS in electrical engineering from the University of Nevada and an MBA from Santa Clara University.


