@CloudExpo: Blog Feed Post

Big Data Comes in One Size Only – Big

My recent talk on Big Data

I gave a talk titled Big Data – Trends and Challenges on Sept 27 in San Jose.

This was organized as a meet-up event by Datapipe and Compassites Software. Datapipe provides cloud infrastructure services to clients, while Compassites Software (where I am a board director) is a technology services firm based in Bangalore, India, focusing on areas like consumerization of IT, cloud computing, and Big Data.

At the talk yesterday, I realized how confused people are about Big Data, since the term is so ill-defined. One thing is for sure: Big Data comes in one size only, Big. Beyond the size issue (petabytes and up), there is the velocity issue (data in motion vs. data at rest) and the variety issue. I pointed out that as the volume of data keeps rising, the percentage of data actually used for analysis and insight keeps declining. About 80% of the data in the world is unstructured, hence new solutions are being invented. Also, M2M (machine-to-machine) or sensor data keeps rising. To put volume in context, I noted that a single engine on a Boeing 747 generates roughly 10 terabytes of data every 30 minutes, so the four engines on a 747 flying across the Atlantic produce a staggering 640TB per flight. With about 25,000 transatlantic flights every day, you can do the math on how much data gets collected daily.
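To make the arithmetic concrete, here is a small sketch of the engine-data math. The flight duration of eight hours is an assumption chosen so the per-flight total lines up with the 640TB figure above; the constants are the ones quoted in the talk, not measurements of mine.

```python
# Back-of-the-envelope math for the Boeing 747 sensor-data example.
TB_PER_ENGINE_PER_HALF_HOUR = 10   # figure quoted in the talk
ENGINES = 4                        # a 747 has four engines
FLIGHT_HOURS = 8                   # assumed transatlantic duration
FLIGHTS_PER_DAY = 25_000           # transatlantic flights per day

per_engine_per_hour_tb = TB_PER_ENGINE_PER_HALF_HOUR * 2
per_flight_tb = per_engine_per_hour_tb * ENGINES * FLIGHT_HOURS
daily_tb = per_flight_tb * FLIGHTS_PER_DAY

print(per_flight_tb)  # 640 TB per flight
print(daily_tb)       # 16,000,000 TB, i.e. about 16 exabytes per day
```

The striking part is the last line: at these rates, transatlantic engine telemetry alone would amount to exabytes per day, which is exactly why only a small fraction of such data is ever kept or analyzed.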

We discussed the business value of Big Data and how the typical pilot project at enterprises seems to be IT log data analysis. Other areas such as fraud detection, social media, and call center feedback are candidates for Big Data applications. On the technology front, much has happened during the last 5-7 years. Most of the innovations are coming out of the new web companies such as Google, Amazon, Yahoo, Facebook, and Twitter. The Hadoop platform is an offshoot of Google's early work on GFS (Google File System) and Google MapReduce. Google is moving beyond Hadoop via its recent work on Dremel, Percolator, and Pregel. Facebook is also launching new projects such as Puma, mostly for real-time access and analysis. Twitter's Storm project is also noteworthy. Google recently launched BigQuery as a cloud service. Then there are dozens of NoSQL products such as Cassandra, Couchbase, MongoDB, Riak, etc.
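Since so much of this lineage traces back to MapReduce, a toy word count illustrates the model Hadoop implements. This is a single-process sketch of the map, shuffle, and reduce phases, not the Hadoop API; in a real cluster each phase runs distributed across machines.

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc):
    # Map: emit a (word, 1) pair for every word in a document.
    return [(word.lower(), 1) for word in doc.split()]

def shuffle(pairs):
    # Shuffle: group all values by key, as the framework does between phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate the grouped values for each key.
    return {key: sum(values) for key, values in groups.items()}

docs = ["big data comes in one size", "one size only big"]
pairs = chain.from_iterable(map_phase(d) for d in docs)
counts = reduce_phase(shuffle(pairs))
print(counts["big"])  # 2
```

The power of the model is that map and reduce are independent per key, so the framework can scale the same program from two sentences to petabytes by adding machines.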

It is important to remember that the world is not being taken over by Hadoop: it is a batch system for handling very large data volumes via distributed parallel processing on commodity hardware. It does not touch the OLTP space, which is critical for industries such as airlines and banking. Also, if your data volume is under 100 terabytes and the data is structured, then current data warehousing offerings built on an RDBMS or on appliances (e.g., Oracle Exadata, IBM Netezza) are excellent solutions. The web-centric interactive world has given rise to the need for extreme scale, and Hadoop-based solutions must learn to coexist with the existing world. Hence Big Data integration will be a key area.
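The guidance above amounts to a simple decision rule, which can be sketched as follows. The thresholds and platform names come straight from the paragraph; the function itself is a hypothetical illustration, not a product recommendation engine.

```python
def suggest_platform(volume_tb, structured, needs_oltp):
    """Rough platform heuristic, following the reasoning in the post."""
    # OLTP workloads (airlines, banking) stay on a transactional RDBMS.
    if needs_oltp:
        return "RDBMS (OLTP)"
    # Structured data under ~100 TB fits today's warehouses and appliances.
    if structured and volume_tb < 100:
        return "Data warehouse / appliance"
    # Very large or unstructured batch workloads suit Hadoop-style systems.
    return "Hadoop-style batch platform"

print(suggest_platform(50, structured=True, needs_oltp=False))
```

Real decisions involve many more factors (latency, skills, cost), but the sketch captures why Hadoop complements rather than replaces the existing stack.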

One thing is for sure: there is a lot of interest in the subject of Big Data, and clarity is the one thing lacking amid all the marketing hype and noise.


More Stories By Jnan Dash

Jnan Dash is Senior Advisor at EZShield Inc., Advisor at ScaleDB and Board Member at Compassites Software Solutions. He has lived in Silicon Valley since 1979. Formerly he was the Chief Strategy Officer (Consulting) at Curl Inc., before which he spent ten years at Oracle Corporation and was the Group Vice President, Systems Architecture and Technology till 2002. He was responsible for setting Oracle's core database and application server product directions and interacted with customers worldwide in translating future needs to product plans. Before that he spent 16 years at IBM. He blogs at http://jnandash.ulitzer.com.

