Big Data – The State of Affairs

Big Data is here to stay, but do we have the tools to efficiently process it?

Many open source and proprietary products claim to handle Big Data. Which is the best fit for the task?

Today's classic RDBMSs and tools are able to quickly load data, process it, and present results in an easy-to-understand format. You can use SQL or a programmatic interface to process the data randomly or in batch, and RDBMSs keep data safe, protected against hardware and software failures.
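As a minimal sketch of the RDBMS access model described above, the example below uses Python's built-in sqlite3 module (the table and data are invented for illustration): SQL serves both batch loading and random lookups, and a transaction protects the load against partial writes.

```python
# Classic RDBMS-style access: SQL for batch and random work, with
# transactional safety. Table name and rows are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE calls (caller TEXT, minutes REAL)")

# Batch load inside a transaction: either all rows commit or none do.
with conn:
    conn.executemany(
        "INSERT INTO calls VALUES (?, ?)",
        [("alice", 12.5), ("bob", 3.0), ("alice", 7.5)],
    )

# Random access: fetch one caller's total usage directly via SQL.
row = conn.execute(
    "SELECT SUM(minutes) FROM calls WHERE caller = ?", ("alice",)
).fetchone()
print(row[0])  # 20.0
```

The same declarative query works whether the table holds three rows or three billion; the scaling limits discussed below are about the engine underneath, not the SQL interface itself.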

Standard tools and products, however, are not able to cope with Big Data requirements, even though the work is not dissimilar to processing today's regular data sets, just on a much bigger scale. Mainstream companies like telcos, financials, and web companies, as well as governments, are reaching the limit of what can be efficiently processed by classic RDBMS technologies.

When it comes to picking a proper platform and tools to handle your Big Data, there are several possible choices:

  • Oracle Exadata - it does not fit an economical mandate; Exadata's weak link and bottleneck is its reliance on the classic Oracle RDBMS
  • NoSQL databases - too immature; they offer no SQL or similar random-access query language (you are presently forced to write programs to access your data) and often achieve scale-out by not implementing all elements of ACID, a trade-off framed by the CAP theorem
  • Hadoop/MapReduce and the related open source ecosystem (Pig, Hive, HBase) - useful for cheap data storage on commodity hardware and batch processing; they offer no efficient, non-programmatic random access
  • proprietary MPP databases running on commodity hardware (Vertica, Aster Data, Greenplum) - very fast, and they can provide random, SQL access to big data; their management features and general feature sets are immature
  • proprietary MPP databases running on specialized hardware (Teradata) - fairly expensive (they don't run on commodity hardware)
  • new platforms that are trying, or will try, to emulate Google Percolator and Dremel (Google's latest technologies for ACID-compliant transactions and reporting on big data), similarly to how Hadoop originated from Google GFS and MapReduce
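To make the Hadoop/MapReduce bullet concrete, here is a toy, single-machine sketch of the programmatic, batch-oriented access model it refers to: a word count expressed as explicit map and reduce phases. The function names and input data are invented for illustration; a real Hadoop job would distribute these phases across a cluster.

```python
# Toy MapReduce-style word count: the caller writes code for every
# question asked of the data, rather than issuing a declarative query.
from itertools import groupby
from operator import itemgetter

def map_phase(lines):
    # Map: emit a (word, 1) pair for every word in every input line.
    for line in lines:
        for word in line.split():
            yield (word, 1)

def reduce_phase(pairs):
    # Sort stands in for the shuffle step that groups keys together.
    pairs = sorted(pairs, key=itemgetter(0))
    # Reduce: sum the counts for each distinct word.
    for word, group in groupby(pairs, key=itemgetter(0)):
        yield (word, sum(count for _, count in group))

lines = ["big data big tools", "big scale"]
print(dict(reduce_phase(map_phase(lines))))
# {'big': 3, 'data': 1, 'scale': 1, 'tools': 1}
```

The contrast with the SQL systems in the other bullets is the point: what SQL expresses as one `GROUP BY` clause here requires hand-written map and reduce code, which is exactly the "no efficient, non-programmatic random access" limitation noted above.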

We would say that there is no single, generic product or platform available today that can handle this task. Depending on your needs, you will have to deploy and combine quite a few technologies to come closer to end-to-end, efficient, comprehensive processing of Big Data. You will quite likely have to custom-build solutions to fit your particular needs, as off-the-shelf solutions are still immature, incomplete, or unavailable.

Big Data is an area of growth and innovation, so the current picture is bound to change as new products and technologies appear, bringing us closer to the ultimate goal of routine, efficient processing of Big Data.

More Stories By Ranko Mosic

Ranko Mosic, BScEng, specializes in Big Data and data architecture consulting services (database/data architecture, machine learning). His clients are in the finance, retail, and telecommunications industries. Ranko welcomes inquiries about his availability for consulting engagements and can be reached at 408-757-0053 or [email protected]


