By Roman Stanek
November 19, 2012 07:00 AM EST
Inventory levels. Sales results. Negative comments on Facebook. Positive comments on Twitter. Shopping on Amazon. Listening to Pandora. Online search habits. No matter what you call it or what the information describes, it’s all data being collected about you.
Thanks to new technologies like Hadoop, once-unquantifiable data (like Facebook conversations and Tweets) can now be quantified. Now, because nearly everything is measurable, everything is measured. The result: companies are spending big dollars to collect, store and measure astronomical amounts of data.
Show me the data!
There’s a name for this movement: Big Data. It is more than a name; it has been the “it” term of 2012, possibly trumping “the cloud.”
IDC defines Big Data as projects that collect at least 100 terabytes of data (hence the name) in two or more data formats. Earlier this year, the research firm predicted the market for Big Data technology and services will reach $16.9 billion by 2015, up from $3.2 billion in 2010. That’s an astounding annual growth rate of roughly 40 percent ($3.2 billion compounded at 40 percent for five years comes to about $17 billion).
The interesting thing is that IDC expects most of this spending to focus on infrastructure — the plumbing that enables companies to download, collect and store vast amounts of data.
To me, this is a missed opportunity. Why? We need to focus on unlocking the real business benefits from all this data.
Companies have not yet grasped the business potential of all the data pouring in from hundreds of sources: apps in the cloud, on-premise partner software and their own enterprise systems. In effect, businesses haven’t figured out how to make money from this fire hose of disparate data sources.
My point of view is that Big Data’s only real value lies in businesses’ ability to transform data into insight they can act on.
This means enabling sales managers to quickly analyze sales reps’ results, view new contracts lost or signed, and react to how actual performance compares against the plan they set months earlier. It means showing help-desk staff how individual customers affect sales and profit, so they know when to go above and beyond to retain certain customers and when to let low-value accounts churn. And it means helping insurance agents predict the kinds and amounts of damage likely as hurricanes hurtle toward their region.
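To make the first of those concrete, here is a minimal sketch (my own illustration, not a specific product) of the plan-versus-actual view a sales manager might want. The file and column names are hypothetical stand-ins for data that would really come from a CRM such as salesforce.com.

```python
# A minimal sketch of a plan-vs-actual report per sales rep.
# "actuals.csv" (rep, amount per closed deal) and "plan.csv" (rep, quota)
# are hypothetical files standing in for CRM exports.
import pandas as pd

actuals = pd.read_csv("actuals.csv")
plan = pd.read_csv("plan.csv")

# Total closed business per rep, joined to each rep's quota
summary = (
    actuals.groupby("rep", as_index=False)["amount"].sum()
    .merge(plan, on="rep", how="right")
    .fillna({"amount": 0})
)
summary["attainment_pct"] = 100 * summary["amount"] / summary["quota"]

# Surface the reps furthest behind plan so a manager can react now,
# not at the end of the quarter
print(summary.sort_values("attainment_pct").head(10))
```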
Steps to Monetize Big Data
To glean value from Big Data efforts, companies need to embrace the real-time value provided by the cloud. Viewing one’s data in real time through the lens of cloud computing enables anyone, in any company, to make smart business decisions from the mammoth amounts of data arriving from every direction.
Therefore, companies looking to monetize Big Data need to take these steps:
Use the cloud: These days businesses can tap into an enormous range of cloud services. They can subscribe to high-performance infrastructure services like Amazon Web Services, rent platforms as a service (comprising hardware, operating systems, storage and network capacity) from salesforce.com, store information in services like Box or automate billings with companies like Zuora. These are just examples.
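As one illustrative sketch of how low the barrier is, assuming an AWS account with credentials already configured, landing raw data in an infrastructure service is a few lines of code rather than a project. The bucket and file names below are hypothetical.

```python
# A minimal sketch: push a day's worth of raw event data into cloud
# storage, where it can later feed whatever analytics stack you choose.
# Bucket and object names are hypothetical placeholders.
import boto3

s3 = boto3.client("s3")

s3.upload_file(
    Filename="events-2012-11-19.json",
    Bucket="my-company-raw-data",
    Key="events/2012/11/19/events.json",
)
```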
Companies can also pick and choose from a long list of cloud-based apps to handle business tasks, from customer relationship management and marketing to human resources and financial management. In fact, I would argue that cloud services will become the business application suite, eventually displacing behemoth on-premise packages from SAP or Oracle. Emphasis on “eventually,” since few enterprises are ready to jettison their million-dollar investments in Oracle and SAP.
For this reason, I advise companies to:
Start with what’s important: Forget about hooking up every data source at once. Data today spews in from hundreds of sources, be it sales and customer data from salesforce.com, inventory levels from SAP, logistics information from your suppliers or employee data from Oracle. Companies run into trouble when they start off by boiling the ocean, which is why I suggest they begin with a few sources and build up from there.
Fortunately, there is a way, thanks to a new generation of application programming interfaces (APIs) that allows more kinds of software, from different software makers, to communicate with each other, regardless of location. As a result, any company, regardless of size, can access the data it needs to make better decisions.
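As a sketch of what that looks like in practice, the snippet below pulls data from two cloud services over plain HTTPS. The URLs, token and field names are hypothetical placeholders rather than any specific vendor’s actual API.

```python
# A minimal sketch of pulling data from two cloud services over REST.
# The endpoints and token are hypothetical, not a real vendor's API.
import requests

headers = {"Authorization": "Bearer YOUR_API_TOKEN"}

# Recent opportunities from a hosted CRM
deals = requests.get(
    "https://api.example-crm.com/v1/opportunities?closed_after=2012-10-01",
    headers=headers,
    timeout=30,
).json()

# Current inventory levels from an ERP system's API
inventory = requests.get(
    "https://api.example-erp.com/v1/inventory/levels",
    headers=headers,
    timeout=30,
).json()

# With both feeds in hand, a simple join on product ID is enough to see
# which fast-selling products are about to run out of stock.
```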
Which is why my next point is:
Make Big Data insight democratic: Five years ago, only executives at very large companies had access to business intelligence tools that culled patterns from data.
The cloud makes everything democratic: not just access to the data itself, but the insight as well, including best practices that don’t require the expertise of a SQL or MapReduce programmer. The cloud enables anyone, anywhere, to recognize patterns in data and make smart decisions faster. And that means any business professional, at any company, should be able to monetize their Big Data.
When Big Data finally becomes useful to the rest of us, and not just to IT wizards, it will take on an even larger role, both today and in the years ahead.