By Jim Vogt
May 11, 2012 10:00 AM EDT
About once every five years or so, the technology industry blazes a new path of innovation. The PC, the Internet, smart mobility and social networking have emerged over the past 20-plus years, delivering new technologies and business ecosystems that have fundamentally changed the world. The latest catalyst is Big Data.
Nearly every major new computing era has had a hot IPO serve as a catalyst for wider adoption of the shift. The recent Splunk IPO evokes parallels with Netscape, whose 1995 IPO catalyzed a wave of Internet computing in both B2C and B2B marketplaces. It ushered in a surge of innovation and a plethora of new dot-com businesses. Hundreds of billions of dollars in new value were subsequently created, and business environments changed forever.
Big Data refers to the enormous volume, velocity, and variety of data that exists and has the potential to be turned into business value. The challenge of Big Data is taking inhuman amounts of data and distilling it into information that human brains can use. Most businesses accumulate astronomical amounts of data - and the volume is expanding at an alarming rate. According to IDC, the volume of digital content in the world will grow to 2.7 billion terabytes in 2012, up 48% from 2011, and will reach 8 billion terabytes by 2015. 
The data flood, of course, comes from both structured corporate databases and unstructured data from Web pages, blogs, social networking messages and other sources. For example, there are now countless digital sensors worldwide in industrial equipment, automobiles, electrical meters and shipping crates. They can measure and communicate location, movement, vibration, temperature, humidity, even chemical changes in the air. Companies wield data like a weapon. Retailers like Wal-Mart and Kohl's analyze sales, pricing, economic, demographic and weather data to tailor product selections at particular stores and determine the timing of price markdowns. Logistics companies like UPS mine data on truck delivery times and traffic patterns to fine-tune routing.
Today, a whole ecosystem of new businesses is springing up to engage with this new reality: companies that store data; companies that mine data for insight; and companies that aggregate data to make it manageable. But it's an ecosystem that's still emerging, and its exact shape has yet to make itself clear.
One of the biggest challenges of working with Big Data is assembling it and preparing it for analysis. Different systems store data in different formats, even within the same company. Assembling, standardizing, and cleaning data of irregularities - all without scrubbing it of the information that makes it valuable - is a central challenge of this space.
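To make the cleaning challenge concrete, here is a minimal sketch of the idea: two hypothetical source systems ("crm" and "billing" are invented names, not from any real product) describe the same customer in different field names and date formats, and a normalization step maps both onto one common schema without losing the underlying values.

```python
from datetime import datetime

def normalize(record, source):
    """Map a source-specific record onto one common schema."""
    if source == "crm":       # e.g. {"Name": "Acme", "Signup": "05/11/2012"}
        return {
            "name": record["Name"].strip().upper(),
            "signup": datetime.strptime(record["Signup"], "%m/%d/%Y").date().isoformat(),
        }
    if source == "billing":   # e.g. {"customer": "acme ", "since": "2012-05-11"}
        return {
            "name": record["customer"].strip().upper(),
            "signup": record["since"],  # already ISO 8601
        }
    raise ValueError(f"unknown source: {source}")

# The same customer, as seen by two different systems:
rows = [
    normalize({"Name": "Acme", "Signup": "05/11/2012"}, "crm"),
    normalize({"customer": "acme ", "since": "2012-05-11"}, "billing"),
]
```

After normalization the two records agree field for field, which is the point: standardization removes the formatting irregularities while preserving the information that makes the data valuable.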
Of course, Hadoop, an open source software framework derived from Google's MapReduce and Google File System (GFS) papers, is being leveraged by several technology vendors to do just that. Hadoop distributes a job across a cluster of machines, splitting it into smaller sub-tasks before reducing the results into one master calculation. It is really an old grid computing technique given new life in the age of cloud computing.
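The map-then-reduce pattern can be sketched in a few lines; this is an illustrative toy, not Hadoop's actual API. A word-count job maps each chunk of input to partial counts (the sub-tasks, which a real cluster would run in parallel on separate machines), then reduces the partials into one master tally.

```python
from collections import defaultdict

def map_phase(chunk):
    """Map: count words within one chunk of the input."""
    counts = defaultdict(int)
    for word in chunk.split():
        counts[word.lower()] += 1
    return counts

def reduce_phase(partials):
    """Reduce: merge the per-chunk counts into one master calculation."""
    total = defaultdict(int)
    for partial in partials:
        for word, n in partial.items():
            total[word] += n
    return dict(total)

# Two chunks stand in for data split across a cluster of machines:
chunks = ["Big Data big value", "data velocity data variety"]
result = reduce_phase(map(map_phase, chunks))
```

Because each map call touches only its own chunk, the work parallelizes naturally; the reduce step is the only point where results must be brought back together.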
Hadoop is converging with other technology advances such as high-speed data analysis made possible because of parallel computing, in-memory processing, and lower cost flash memory in the form of solid state drives. The prospect of being able to process troves of data very quickly, in memory, without time-consuming forays to retrieve information stored on disk drives, is a big advance that will enable companies to assemble, sort, and analyze data much more rapidly.
For example, T-Mobile is using SAP's HANA to mine data from stores, text messages and call centers on its 30 million U.S. customers to tailor personalized deals. What used to take a week can be done in three hours with the SAP system. Organizations that can leverage this capability to make faster and more informed business decisions will have a distinct advantage over competitors.
In a short period of time, Hadoop has transitioned from relative obscurity as a consumer Internet project into the mainstream consciousness of enterprise IT. Hadoop is designed to handle mountains of unstructured data. However, as it exists, the open source code is a long way from meeting enterprise requirements for security, management, and efficiency without some serious customization. Enterprise-scale Hadoop deployments require costly IT specialists who are capable of guiding a lot of somewhat disjointed processes. That currently limits adoption to organizations with substantial IT budgets.
It will take a refined platform to enable Hadoop and its derivatives to fit into the enterprise as a complement to existing data analytics and data warehousing tools from established business process vendors like Oracle, HP, and SAP. At Zettaset, for example, we are focused on making Hadoop much more accessible to enterprises of all sizes by creating a high availability platform that takes much of the complexity out of assembling and preparing huge amounts of data for analysis. We have aggregated multiple steps into a streamlined automated process, significantly enhanced security, and are now integrating our software into an appliance which can be racked in the data center and easily managed through a user-friendly GUI.
The true value of Big Data lies in the amount of useful data that can be derived from it. The future of Big Data is therefore to do for data and analytics what Moore's Law has done for computing hardware, and exponentially increase the speed and value of business intelligence. Whether it is linking geography and retail availability, using patient data to forecast public health trends, or analyzing global climate trends, we live in a world full of data. Effectively harnessing Big Data will give businesses a whole new lens through which to see it.
- Source: "IDC Predictions 2012: Competing for 2020," December 2011