Red Hat Unveils Big Data and Open Hybrid Cloud Direction

Red Hat is building a robust network of ecosystem and enterprise integration partners to deliver Big Data solutions

Red Hat on Wednesday announced its Big Data direction and solutions, aimed at enterprise requirements for highly reliable, scalable, and manageable platforms for running Big Data analytics workloads. The company also announced that it will contribute its Red Hat Storage Hadoop plug-in to the Apache Hadoop open community, making Red Hat Storage a fully supported, Hadoop-compatible file system for Big Data environments, and that it is building a robust network of ecosystem and enterprise integration partners to deliver comprehensive Big Data solutions to enterprise customers.

Red Hat Big Data infrastructure and application platforms are well suited to enterprises operating in an open hybrid cloud environment, and Red Hat is working with the open cloud community to support Big Data customers. Many enterprises worldwide use public cloud infrastructure, such as Amazon Web Services (AWS), for the development, proof-of-concept, and pre-production phases of their Big Data projects, then move the workloads to their private clouds to scale up the analytics against larger data sets. An open hybrid cloud environment enables enterprises to transfer workloads from the public cloud into their private cloud without re-tooling their applications. Red Hat is actively engaged in the open cloud community through projects such as OpenStack and OpenShift Origin to help meet these enterprise Big Data expectations both today and in the future.

Red Hat offers several solutions for effectively managing enterprise Big Data workloads. Its Big Data direction focuses on three primary areas: enhanced enterprise-class infrastructure solutions, application and integration platforms, and partnerships with leading Big Data analytics vendors and integrators.

Red Hat's Big Data Infrastructure Solutions

  • Red Hat Enterprise Linux - According to The Linux Foundation's January 2012 Enterprise Linux User Report, the majority of Big Data implementations run on Linux, and as the offering of the leading commercial Linux provider, Red Hat Enterprise Linux is a leading platform for Big Data deployments. Red Hat Enterprise Linux excels in distributed architectures and includes features that address critical Big Data needs. Managing tremendous data volumes and intensive analytic processing requires an infrastructure designed for high performance, reliability, fine-grained resource management, and scale-out storage. Red Hat Enterprise Linux addresses these challenges while adding the ability to develop, integrate, and secure Big Data applications reliably and to scale easily, keeping pace with the rate at which data is generated, analyzed, or transferred. This can be accomplished in the cloud, making it easier to store, aggregate, normalize, and integrate data from sources across multiple platforms, whether they are deployed as physical, virtual, or cloud-based resources.
  • Red Hat Storage - Built on the trusted Red Hat Enterprise Linux operating system and the proven GlusterFS distributed file system, Red Hat Storage Servers can be used to pool inexpensive commodity servers to provide a cost-effective, scalable, and reliable storage solution for Big Data.

Red Hat intends to make its Hadoop plug-in for Red Hat Storage available to the Hadoop community later this year. Currently in technology preview, the Red Hat Storage Apache Hadoop plug-in provides a new storage option for enterprise Hadoop deployments that delivers enterprise storage features while maintaining the API compatibility and local data access the Hadoop community expects. Red Hat Storage brings enterprise-class features to Big Data environments, such as geo-replication, high availability, POSIX compliance, disaster recovery, and management, without compromising API compatibility and data locality. Customers now have a unified data and scale-out storage software platform to accommodate files and objects deployed across physical, virtual, public, and hybrid cloud resources.
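Hadoop-compatible file systems are typically wired into a cluster through Hadoop's core-site.xml, which maps a URI scheme to the plug-in's FileSystem implementation class. As an illustrative sketch only, assuming hypothetical class and host names rather than Red Hat's published configuration, such a registration might look like:

```xml
<!-- core-site.xml: registering a Hadoop-compatible file system plug-in.
     The class name and server address below are illustrative assumptions,
     not Red Hat's documented configuration. -->
<configuration>
  <property>
    <!-- Map the glusterfs:// URI scheme to the plug-in's FileSystem class -->
    <name>fs.glusterfs.impl</name>
    <value>org.apache.hadoop.fs.glusterfs.GlusterFileSystem</value>
  </property>
  <property>
    <!-- Make the plug-in the default file system for MapReduce jobs -->
    <name>fs.default.name</name>
    <value>glusterfs://gluster-server:9000</value>
  </property>
</configuration>
```

Because MapReduce jobs address storage through the FileSystem API rather than HDFS directly, a registration along these lines is what lets existing Hadoop applications run unmodified on an alternative back end.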

  • Red Hat Enterprise Virtualization - Announced in Dec. 2012, Red Hat Enterprise Virtualization 3.1 is integrated with Red Hat Storage, enabling it to access the secure, shared storage pool managed by Red Hat Storage. This integration also offers enterprises reduced operational costs, expanded portability, choice of infrastructure, scalability, availability and the power of community-driven innovation with the contributions of the open source oVirt and Gluster projects. The combination of these platforms furthers Red Hat's open hybrid cloud vision of an integrated and converged Red Hat Storage and Red Hat Enterprise Virtualization node that serves both compute and storage resources.

Red Hat's Big Data Application and Integration Platforms

  • Red Hat JBoss Middleware - Red Hat JBoss Middleware provides enterprises with powerful technologies for creating and integrating big data-driven applications that are able to interact with new and emerging technologies like Hadoop or MongoDB. Big data is only valuable when businesses can extract information and respond intelligently. Red Hat JBoss Middleware solutions can populate large volumes and varieties of data quickly and reliably into Hadoop with high speed messaging technologies; simplify working with MongoDB through Hibernate OGM; process large volumes of data quickly and easily with Red Hat JBoss Data Grid; access Hadoop along with your traditional data sources with JBoss Enterprise Data Services Platform; and identify opportunities and threats through pattern recognition with JBoss Enterprise BRMS. Red Hat's middleware portfolio is well-suited to help enterprises seize the opportunities of big data.

Big Data Partnerships

  • Big Data Ecosystem Partners - To provide a comprehensive Big Data solution set to enterprises, Red Hat plans to partner with leading Big Data software and hardware providers to offer interoperability. Development of certified and documented reference architectures is expected to allow users to integrate and install comprehensive enterprise Big Data solutions.
  • Enterprise Partners - Red Hat anticipates enabling the delivery of a comprehensive big data solution to its customers through leading enterprise integration partners utilizing the reference architectures developed by Red Hat and its big data ecosystem partners.

More Stories By Pat Romanski

News Desk compiles and publishes breaking news stories, press releases and latest news articles as they happen.
