Red Hat Unveils Big Data and Open Hybrid Cloud Direction

Red Hat is building a robust network of ecosystem and enterprise integration partners to deliver Big Data solutions

Red Hat on Wednesday announced its Big Data direction and solutions to satisfy enterprise requirements for highly reliable, scalable, and manageable platforms on which to run Big Data analytics workloads. The company also announced that it will contribute its Red Hat Storage Hadoop plug-in to the Apache Hadoop open community, with the goal of making Red Hat Storage a fully supported, Hadoop-compatible file system for Big Data environments, and that it is building a robust network of ecosystem and enterprise integration partners to deliver comprehensive Big Data solutions to enterprise customers.

Red Hat's Big Data infrastructure and application platforms are well suited to enterprises operating in an open hybrid cloud environment, and Red Hat is working with the open cloud community to support Big Data customers. Many enterprises worldwide use public cloud infrastructure, such as Amazon Web Services (AWS), for the development, proof-of-concept, and pre-production phases of their Big Data projects, then move the workloads to their private clouds to scale up the analytics against the larger data set. An open hybrid cloud environment enables enterprises to transfer workloads from the public cloud into their private cloud without re-tooling their applications. Red Hat is actively engaged in the open cloud community, through projects such as OpenStack and OpenShift Origin, to help meet these enterprise Big Data expectations both today and in the future.

Several Red Hat solutions are available today to manage enterprise Big Data workloads effectively. Red Hat's Big Data direction focuses on three primary areas: extending its product portfolio to deliver enhanced enterprise-class infrastructure solutions, delivering application and integration platforms, and partnering with leading Big Data analytics vendors and integrators.

Red Hat's Big Data Infrastructure Solutions

  • Red Hat Enterprise Linux - According to The Linux Foundation's January 2012 Enterprise Linux User Report, the majority of Big Data implementations run on Linux, and as the platform from the leading provider of commercial Linux, Red Hat Enterprise Linux is a leading choice for Big Data deployments. Red Hat Enterprise Linux excels in distributed architectures and includes features that address critical Big Data needs. Managing tremendous data volumes and intensive analytic processing requires an infrastructure designed for high performance, reliability, fine-grained resource management, and scale-out storage. Red Hat Enterprise Linux addresses these challenges while adding the ability to develop, integrate, and secure Big Data applications reliably and to scale easily with the pace at which data is generated, analyzed, or transferred. This can be accomplished in the cloud, making it easier to store, aggregate, normalize, and integrate data from sources across multiple platforms, whether they are deployed as physical, virtual, or cloud-based resources.
  • Red Hat Storage - Built on the trusted Red Hat Enterprise Linux operating system and the proven GlusterFS distributed file system, Red Hat Storage Servers can be used to pool inexpensive commodity servers to provide a cost-effective, scalable, and reliable storage solution for Big Data.

Red Hat intends to make its Hadoop plug-in for Red Hat Storage available to the Hadoop community later this year. Currently in technology preview, the Red Hat Storage Apache Hadoop plug-in provides a new storage option for enterprise Hadoop deployments, delivering enterprise storage features while maintaining the API compatibility and local data access the Hadoop community expects (see the sketch after this list). Red Hat Storage brings enterprise-class features to Big Data environments, such as geo-replication, high availability, POSIX compliance, disaster recovery, and management, without compromising API compatibility and data locality. Customers now have a unified data and scale-out storage software platform to accommodate files and objects deployed across physical, virtual, public cloud, and hybrid cloud resources.

  • Red Hat Enterprise Virtualization - Announced in December 2012, Red Hat Enterprise Virtualization 3.1 is integrated with Red Hat Storage, enabling it to access the secure, shared storage pool managed by Red Hat Storage. This integration also offers enterprises reduced operational costs, expanded portability, choice of infrastructure, scalability, availability, and the power of community-driven innovation through contributions from the open source oVirt and Gluster projects. The combination of these platforms furthers Red Hat's open hybrid cloud vision of an integrated and converged Red Hat Storage and Red Hat Enterprise Virtualization node that serves both compute and storage resources.
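
The plug-in approach described above builds on Hadoop's pluggable file system abstraction: applications address storage through a URI scheme and the standard org.apache.hadoop.fs.FileSystem API, so a Hadoop-compatible back end such as Red Hat Storage can be swapped in through configuration rather than code changes. The sketch below is illustrative only; the "glusterfs" URI scheme, the fs.glusterfs.impl property, and the implementation class name are assumptions about how such a plug-in might be wired in, not documented settings from the announcement.

    import java.net.URI;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class PluggableStorageSketch {
        public static void main(String[] args) throws Exception {
            Configuration conf = new Configuration();

            // Hypothetical wiring for a Hadoop-compatible file system plug-in.
            // The property name and implementation class are assumptions for
            // illustration; a real deployment would use the plug-in's documented
            // configuration keys.
            conf.set("fs.glusterfs.impl",
                     "org.apache.hadoop.fs.glusterfs.GlusterFileSystem");

            // Application code is written against the generic FileSystem API,
            // so only the URI scheme distinguishes HDFS from an alternative
            // back end such as Red Hat Storage.
            FileSystem fs = FileSystem.get(URI.create("glusterfs://server:9000/"), conf);

            Path out = new Path("/analytics/input/events.txt");
            try (FSDataOutputStream stream = fs.create(out)) {
                stream.writeBytes("sample record\n");
            }
            fs.close();
        }
    }

Because the same FileSystem calls work whether the URI points at HDFS or at a compatible plug-in, existing MapReduce jobs and their expectations around data locality can be preserved while the underlying storage layer changes.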

Red Hat's Big Data Application and Integration Platforms

  • Red Hat JBoss Middleware - Red Hat JBoss Middleware provides enterprises with powerful technologies for creating and integrating Big Data-driven applications that can interact with new and emerging technologies like Hadoop and MongoDB. Big Data is only valuable when businesses can extract information and respond intelligently. Red Hat JBoss Middleware solutions can feed large volumes and varieties of data into Hadoop quickly and reliably with high-speed messaging technologies; simplify working with MongoDB through Hibernate OGM (see the sketch below); process large volumes of data quickly and easily with Red Hat JBoss Data Grid; access Hadoop alongside traditional data sources with JBoss Enterprise Data Services Platform; and identify opportunities and threats through pattern recognition with JBoss Enterprise BRMS. Red Hat's middleware portfolio is well suited to help enterprises seize the opportunities of Big Data.
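
As a concrete illustration of the Hibernate OGM point above, developers can persist objects to MongoDB through the standard JPA programming model rather than coding against the MongoDB driver directly. The following is a minimal sketch, assuming a hypothetical persistence unit named "bigdata-pu" whose persistence.xml declares the Hibernate OGM provider and the MongoDB datastore; the entity and names are illustrative, not taken from the announcement.

    import javax.persistence.Entity;
    import javax.persistence.EntityManager;
    import javax.persistence.EntityManagerFactory;
    import javax.persistence.Id;
    import javax.persistence.Persistence;

    // A plain JPA entity; with Hibernate OGM configured, instances are stored
    // as documents in a MongoDB collection.
    @Entity
    class SensorReading {
        @Id
        private String id;
        private double value;

        SensorReading() { }

        SensorReading(String id, double value) {
            this.id = id;
            this.value = value;
        }
    }

    public class OgmSketch {
        public static void main(String[] args) {
            // "bigdata-pu" is a hypothetical persistence unit; its persistence.xml
            // would point Hibernate OGM at the MongoDB datastore.
            EntityManagerFactory emf =
                    Persistence.createEntityManagerFactory("bigdata-pu");
            EntityManager em = emf.createEntityManager();

            em.getTransaction().begin();
            em.persist(new SensorReading("reading-1", 42.5));
            em.getTransaction().commit();

            em.close();
            emf.close();
        }
    }

The application code stays within familiar JPA idioms, which is the simplification Hibernate OGM targets; only the persistence configuration changes to direct the data to MongoDB.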

Big Data Partnerships

  • Big Data Ecosystem Partners - To provide a comprehensive Big Data solution set to enterprises, Red Hat plans to partner with leading Big Data software and hardware providers to ensure interoperability. The development of certified and documented reference architectures is expected to allow users to integrate and install comprehensive enterprise Big Data solutions.
  • Enterprise Partners - Red Hat anticipates enabling the delivery of comprehensive Big Data solutions to its customers through leading enterprise integration partners, using the reference architectures developed by Red Hat and its Big Data ecosystem partners.

More Stories By Pat Romanski

News Desk compiles and publishes breaking news stories, press releases and latest news articles as they happen.
