Data Lake Phenomenon | @ThingsExpo #IoT #M2M #BigData #Microservices

Data Lake Phenomenon Among Enterprises

Over the past few years, there has been an explosion in the volume of data. To tackle this big data explosion, enterprises have launched a growing number of successful Hadoop projects. The large volumes of data, the emergence of Hadoop technology, and the need to store all of this siloed data in one place have prompted a phenomenon among enterprises called the Data Lake.

Is the Data Lake an effective catchment for all of the enterprise data?

Yes and no. Data lakes are good for housing current, interrelated data, but they do not address the need for an enterprise-wide data management system:

  • Since the data lake holds raw data of many different types, business users cannot get controlled access to risk-free, secure, governed, and curated data with semantic consistency, as they can with an enterprise data warehouse
  • Enterprise data today is heterogeneous and locked in disparate sources, and the data coming from these systems often conflicts
  • A data lake is agnostic to the type of data it receives; without governance, descriptive metadata, and a mechanism to maintain that metadata, it can easily turn into a data swamp with too much unusable data (a minimal metadata registry is sketched after this list)
  • Hadoop and related technologies are still nascent even among early adopters, who are mostly conversant with SQL for data discovery and require training in Pig and MapReduce for data access, which slows down time-to-value for enterprises
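
To make the governance point concrete, here is a minimal sketch of the kind of descriptive metadata a lake needs so that ingested data stays discoverable instead of turning into a swamp. The `DatasetRecord` type, its fields, and the `register` function are illustrative assumptions for this post, not the API of Hadoop or any particular catalog product.

```python
# Hypothetical sketch: a minimal metadata registry for data lake ingests.
# All names and fields are illustrative assumptions, not a real product's API.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DatasetRecord:
    name: str              # logical dataset name
    source_system: str     # where the raw data came from
    owner: str             # who is accountable for the data
    schema: dict           # column name -> type, for discovery
    ingested_on: date      # when it landed in the lake
    tags: list = field(default_factory=list)  # free-form descriptive tags

catalog: dict[str, DatasetRecord] = {}

def register(record: DatasetRecord) -> None:
    """Refuse untagged, ownerless data, the kind that turns a lake into a swamp."""
    if not record.owner or not record.tags:
        raise ValueError(f"{record.name}: datasets must have an owner and tags")
    catalog[record.name] = record

register(DatasetRecord(
    name="member_claims_raw",
    source_system="legacy_claims_db",
    owner="data-governance@example.com",
    schema={"member_id": "string", "claim_amount": "decimal"},
    ingested_on=date(2016, 5, 1),
    tags=["healthcare", "raw", "pii"],
))
```

Even this small amount of structure gives business users something to search and gives governance teams a place to attach access rules, which a pile of raw files does not.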

Hortonworks has helped enterprises with the Data Lake phenomenon. One example is VHA, the largest member-owned healthcare company in the US, which delivers industry-leading supply chain management and clinical improvement services to its members. The company had its product, supplier, member, and other data spread across multiple sources, residing in silos. VHA used the Hortonworks Data Platform to enable business users to discover the related data and provide services to their members. Because of its previous success with data virtualization using the Denodo Platform, VHA decided to use data virtualization to let business users discover data with familiar SQL, abstracting away direct access to Hadoop. A sketch of that access pattern follows.
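
To illustrate the access pattern described above, here is a minimal sketch of a business user running plain SQL against a virtual view while the virtualization layer federates the query to Hadoop behind the scenes. The DSN, view name, and columns are hypothetical; the only assumption is that the virtualization server exposes a standard ODBC endpoint, as most such platforms do. This is not a description of VHA's actual implementation.

```python
# Hypothetical sketch: SQL access through a data virtualization layer.
# DSN, view, and column names are made up for illustration.
import pyodbc

# Connect to the virtualization server via a standard ODBC DSN.
# The user never touches HDFS, Pig, or MapReduce directly.
conn = pyodbc.connect("DSN=virtual_data_layer;UID=analyst;PWD=secret")
cursor = conn.cursor()

# An ordinary SQL query against a curated virtual view; the virtualization
# layer pushes the work down to Hadoop and other sources as needed.
cursor.execute("""
    SELECT supplier_name, SUM(order_total) AS total_spend
    FROM vw_member_supply_orders
    WHERE order_date >= ?
    GROUP BY supplier_name
    ORDER BY total_spend DESC
""", "2016-01-01")

for supplier_name, total_spend in cursor.fetchall():
    print(f"{supplier_name}: {total_spend}")

conn.close()
```

The design point is that SQL-conversant analysts keep their existing skills and tools, while the plumbing that reaches into the data lake stays behind the virtual view.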

More Stories By Bob Gourley

Bob Gourley writes on enterprise IT. He is a founder of Crucial Point and publisher of CTOvision.com.
