By Jim Vogt
May 11, 2012 10:00 AM EDT
About once every five years, the technology industry blazes a new path of innovation. The PC, the Internet, smart mobility and social networking have emerged over the past 20-plus years, delivering new technologies and business ecosystems that have fundamentally changed the world. The latest catalyst is Big Data.
Nearly every major new computing era has had a hot IPO serve as a catalyst for more widespread adoption of the shift. The recent Splunk IPO evokes parallels with Netscape, whose 1995 IPO catalyzed a wave of Internet computing for both B2C and B2B marketplaces, ushering in new innovation and a plethora of new .com businesses. Hundreds of billions of dollars in new value were subsequently created, and business environments changed forever.
Big Data refers to the enormous volume, velocity, and variety of data that exists and has the potential to be turned into business value. The challenge of Big Data is taking inhuman amounts of data and distilling it into information that human brains can use. Most businesses accumulate astronomical amounts of data - and the volume is expanding at an alarming rate. According to IDC, the volume of digital content in the world will grow to 2.7 billion terabytes in 2012, up 48% from 2011, and will reach 8 billion terabytes by 2015. 
The data flood, of course, comes from both structured corporate databases and unstructured data from Web pages, blogs, social networking messages and other sources. For example, there are now countless digital sensors worldwide in industrial equipment, automobiles, electrical meters and shipping crates. They can measure and communicate location, movement, vibration, temperature, humidity, even chemical changes in the air. Companies wield data like a weapon. Retailers like Wal-Mart and Kohl's analyze sales, pricing and economic, demographic and weather data to tailor product selections at particular stores and determine the timing of price markdowns. Logistics companies like UPS mine data on truck delivery times and traffic patterns to fine-tune routing.
Today, a whole ecosystem of new businesses is springing up to engage with this new reality: companies that store data; companies that mine data for insight; and companies that aggregate data to make it manageable. But it's an ecosystem that's still emerging, and its exact shape has yet to make itself clear.
One of the biggest challenges of working with Big Data is assembling it and preparing it for analysis. Different systems store data in different formats, even within the same company. Assembling, standardizing, and cleaning data of irregularities - all without scrubbing it of the information that makes it valuable - is a central challenge of this space.
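To make the challenge concrete, here is a minimal, hypothetical sketch of that "assemble and standardize" step: two systems in the same company export the same customer data in different shapes (the field names and formats here are invented for illustration), and each is normalized into one common schema without discarding the fields that carry value.

```python
# Hypothetical example: normalize customer records from two internal
# systems that store the same information in different formats.
from datetime import datetime

def from_crm(row):
    # The CRM exports "Last, First" names and US-style dates.
    last, first = [p.strip() for p in row["name"].split(",")]
    return {
        "first_name": first,
        "last_name": last,
        "signup": datetime.strptime(row["date"], "%m/%d/%Y").date(),
    }

def from_weblog(row):
    # The web system splits names already but uses ISO dates
    # and lowercase text.
    return {
        "first_name": row["fname"].strip().title(),
        "last_name": row["lname"].strip().title(),
        "signup": datetime.strptime(row["joined"], "%Y-%m-%d").date(),
    }

# Both sources now land in one consistent schema, ready for analysis.
records = [
    from_crm({"name": "Doe, Jane", "date": "05/11/2012"}),
    from_weblog({"fname": "john", "lname": "smith", "joined": "2012-05-10"}),
]
```

The point is not the specific transforms but the pattern: one adapter per source system, all converging on a single schema that preserves the analytically useful fields.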
Of course, Hadoop, an open source software framework derived from Google's MapReduce and Google File System (GFS) papers, is being leveraged by several technology vendors to do just that. Hadoop splits a job into smaller sub-tasks, maps them across a cluster of machines, then reduces the results into one master calculation. It's really an old grid computing technique given new life in the age of cloud computing.
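The map/reduce pattern Hadoop runs at cluster scale can be sketched on a single machine. Below is a minimal word-count example, the canonical MapReduce illustration: the map phase emits key/value pairs, and the reduce phase groups them by key and combines each group into one result. (This is a conceptual sketch, not Hadoop's actual API.)

```python
# Single-machine sketch of the MapReduce pattern: map records to
# key/value pairs, group by key, reduce each group to one value.
from collections import defaultdict

def map_phase(records):
    # Map: emit (word, 1) for every word in every input record.
    for record in records:
        for word in record.split():
            yield word.lower(), 1

def reduce_phase(pairs):
    # Shuffle: group all values by key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    # Reduce: collapse each group into a single count.
    return {key: sum(values) for key, values in groups.items()}

logs = ["big data big value", "data at scale"]
counts = reduce_phase(map_phase(logs))
print(counts)  # {'big': 2, 'data': 2, 'value': 1, 'at': 1, 'scale': 1}
```

Because the map and reduce functions touch only their own inputs, each phase can be distributed across many machines independently, which is exactly the property Hadoop exploits.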
Hadoop is converging with other technology advances such as high-speed data analysis made possible because of parallel computing, in-memory processing, and lower cost flash memory in the form of solid state drives. The prospect of being able to process troves of data very quickly, in memory, without time-consuming forays to retrieve information stored on disk drives, is a big advance that will enable companies to assemble, sort, and analyze data much more rapidly.
For example, T-Mobile is using SAP's HANA to mine data from stores, text messages and call centers on its 30 million U.S. customers to tailor personalized deals. What used to take a week can be done in three hours with the SAP system. Organizations that can leverage this capability to make faster and more informed business decisions will have a distinct advantage over competitors.
In a short period of time, Hadoop has transitioned from relative obscurity as a consumer Internet project into the mainstream consciousness of enterprise IT. Hadoop is designed to handle mountains of unstructured data. However, as it exists, the open source code is a long way from meeting enterprise requirements for security, management, and efficiency without some serious customization. Enterprise-scale Hadoop deployments require costly IT specialists who are capable of guiding a lot of somewhat disjointed processes. That currently limits adoption to organizations with substantial IT budgets.
It will take a refined platform to enable Hadoop and its derivatives to fit into the enterprise as a complement to existing data analytics and data warehousing tools from established business process vendors like Oracle, HP, and SAP. At Zettaset, for example, we are focused on making Hadoop much more accessible to enterprises of all sizes by creating a high availability platform that takes much of the complexity out of assembling and preparing huge amounts of data for analysis. We have aggregated multiple steps into a streamlined automated process, significantly enhanced security, and are now integrating our software into an appliance which can be racked in the data center and easily managed through a user-friendly GUI.
The true value of Big Data lies in the useful information that can be derived from it. The future of Big Data is therefore to do for data and analytics what Moore's Law has done for computing hardware, and exponentially increase the speed and value of business intelligence. Whether it is linking geography and retail availability, using patient data to forecast public health trends, or analyzing global climate trends, we live in a world full of data. Effectively harnessing Big Data will give businesses a whole new lens through which to see it.
- Source: "IDC Predictions 2012: Competing for 2020," December 2011