Big Data & Analytics – What's New?

Three vendors are worth mentioning here in the Hadoop solution space

A friend of mine from my IBM days (an expert in Data Warehousing, BI, etc.) told me about the Hadoop conference he attended in San Jose a few weeks back. When he attended the same conference two years ago in New York, there were barely 200 attendees; this time the number exceeded 2,000 and it was a sold-out event. That alone shows how quickly Hadoop has generated interest. He said one recurring theme was the need for Hadoop skills: almost every presentation included a "we are hiring" slide.

Hadoop offers a massively scalable data management and analysis environment that can handle many different data types without the complicated transformation and schema changes required to load diverse data into a conventional RDBMS. Remember the days of ETL (Extraction, Transformation, Loading), when data massaging and cleansing had to precede the creation of the data warehouse for analytics purposes? Given the growth in data volume, velocity, and variety, the era of "Big Data" has arrived, and new tools such as Hadoop are the need of the hour for search and analytics.
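
To make the contrast with ETL concrete, here is a minimal sketch of the classic word-count job written against the standard Apache Hadoop MapReduce API (org.apache.hadoop.mapreduce); the input and output HDFS paths are placeholders passed on the command line. The point is that the mapper reads raw text files exactly as they land in HDFS, with no schema definition or transformation step up front.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

      // Mapper: reads raw text lines exactly as stored in HDFS and emits (word, 1) pairs.
      public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
        private static final IntWritable ONE = new IntWritable(1);
        private final Text word = new Text();

        @Override
        public void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
          StringTokenizer tokens = new StringTokenizer(value.toString());
          while (tokens.hasMoreTokens()) {
            word.set(tokens.nextToken());
            context.write(word, ONE);
          }
        }
      }

      // Reducer (also used as combiner): sums the counts emitted for each word.
      public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
        private final IntWritable result = new IntWritable();

        @Override
        public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
          int sum = 0;
          for (IntWritable val : values) {
            sum += val.get();
          }
          result.set(sum);
          context.write(key, result);
        }
      }

      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));   // raw input files in HDFS
        FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory in HDFS
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }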

Three vendors are worth mentioning here in the Hadoop solution space.

- Cloudera is the market-share leader. It offers the open source Apache Hadoop distribution, now in its fourth generation (CDH4), together with its proprietary system-management software. The new version of CDH adds high availability, improved security, and hot failover for the NameNode (the metadata server of HDFS, the file system), a node that has long been known as a single point of failure (not good for enterprise needs). A sketch of what such an HA configuration looks like appears after this list.

- Hortonworks, which spun out of Yahoo last year, has released its first product, the Hortonworks Data Platform. It uses the Hadoop 1.0 code base (more stable), reassuring enterprise users. It addresses high availability and failover with VMware virtualization, and it uses open source software for its management console as well as for ETL (Talend software).

- The third player is MapR, which pitches its Hadoop distribution as a high-performance alternative, replacing HDFS with a derivative of the Unix-based Network File System (NFS) that is highly scalable and offers high-availability features. MapR is also part of Amazon's Elastic MapReduce service.
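
For a sense of what the NameNode high-availability story involves (mentioned in the Cloudera item above), below is a minimal hdfs-site.xml sketch in the style of Apache Hadoop 2.x quorum-journal HA. The nameservice id "mycluster", hostnames, and ports are placeholders for illustration, not settings from any particular vendor's distribution; each distribution's management tools generate their own equivalent.

    <!-- hdfs-site.xml: a minimal NameNode HA sketch (Apache Hadoop 2.x, quorum-journal style). -->
    <!-- The nameservice id "mycluster", hostnames, and ports below are placeholders. -->
    <configuration>
      <property>
        <name>dfs.nameservices</name>
        <value>mycluster</value>
      </property>
      <property>
        <!-- Two NameNodes: one active, one hot standby. -->
        <name>dfs.ha.namenodes.mycluster</name>
        <value>nn1,nn2</value>
      </property>
      <property>
        <name>dfs.namenode.rpc-address.mycluster.nn1</name>
        <value>namenode1.example.com:8020</value>
      </property>
      <property>
        <name>dfs.namenode.rpc-address.mycluster.nn2</name>
        <value>namenode2.example.com:8020</value>
      </property>
      <property>
        <!-- Shared edit log on a JournalNode quorum, so the standby stays up to date. -->
        <name>dfs.namenode.shared.edits.dir</name>
        <value>qjournal://jn1.example.com:8485;jn2.example.com:8485;jn3.example.com:8485/mycluster</value>
      </property>
      <property>
        <!-- Clients ask this proxy provider which NameNode is currently active. -->
        <name>dfs.client.failover.proxy.provider.mycluster</name>
        <value>org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider</value>
      </property>
      <property>
        <!-- Automatic failover coordinated through ZooKeeper (ZKFC daemons). -->
        <name>dfs.ha.automatic-failover.enabled</name>
        <value>true</value>
      </property>
    </configuration>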

Hadoop scales in linear fashion to solve the data-volume challenge and runs on (less expensive) commodity hardware. Its challenges are a shortage of skills and batch-related delays. Many IT shops want to integrate their existing BI systems with Hadoop, either to analyze data inside a cluster or to work with result sets moved out of Hadoop. New analytics vendors are popping up; two start-ups are worth mentioning: Datameer and Karmasphere.

Datameer's analytics platform provides modules for data integration from sources ranging from mainframes to Twitter. It offers a spreadsheet-driven data-analysis environment meant for business analysts without IT skills. Karmasphere also provides reporting, analysis, and data visualization on Hadoop. It uses a graphical interface and a collaborative workflow that work with Hive, the data-warehousing component of Hadoop.
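
As an illustration of how BI-style SQL can be run against Hadoop through Hive, here is a small, hypothetical Java sketch using the HiveServer2 JDBC driver. The host name, credentials, and the web_logs table are made-up placeholders; Hive compiles the query into MapReduce jobs that execute on the cluster.

    import java.sql.Connection;
    import java.sql.DriverManager;
    import java.sql.ResultSet;
    import java.sql.Statement;

    public class HiveQueryExample {
      public static void main(String[] args) throws Exception {
        // Load the HiveServer2 JDBC driver (shipped with the Hive client libraries).
        Class.forName("org.apache.hive.jdbc.HiveDriver");

        // Host, credentials, and the web_logs table are made-up placeholders.
        try (Connection conn = DriverManager.getConnection(
                 "jdbc:hive2://hive-server.example.com:10000/default", "analyst", "");
             Statement stmt = conn.createStatement();
             // A BI-style aggregate; Hive turns it into MapReduce jobs on the cluster.
             ResultSet rs = stmt.executeQuery(
                 "SELECT page, COUNT(*) AS hits FROM web_logs "
                     + "GROUP BY page ORDER BY hits DESC LIMIT 10")) {
          while (rs.next()) {
            System.out.println(rs.getString("page") + "\t" + rs.getLong("hits"));
          }
        }
      }
    }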

Hadoop integration with the current BI environment will be a critical need, as years of investment in BI and analytics will not be thrown away just to accommodate the new analytic tools.

More Stories By Jnan Dash

Jnan Dash is Senior Advisor at EZShield Inc., Advisor at ScaleDB and Board Member at Compassites Software Solutions. He has lived in Silicon Valley since 1979. Formerly he was the Chief Strategy Officer (Consulting) at Curl Inc., before which he spent ten years at Oracle Corporation and was the Group Vice President, Systems Architecture and Technology till 2002. He was responsible for setting Oracle's core database and application server product directions and interacted with customers worldwide in translating future needs to product plans. Before that he spent 16 years at IBM. He blogs at http://jnandash.ulitzer.com.
