Rackspace & Hortonworks Cut Strategic Deal

The announcement says the “joint effort will pursue an OpenStack-based Hadoop solution for the public and private cloud”

Rackspace has cut a strategic agreement with Hortonworks, the open source Apache Hadoop start-up, to resell Hortonworks' platform.

The two also mean to eliminate the complexity and time-consuming manual processes required to implement a Big Data solution on Hadoop.

The announcement says the "joint effort will pursue an OpenStack-based Hadoop solution for the public and private cloud, which can easily be deployed in minutes."

Gartner figures Big Data will drive $232 billion in IT spending through 2016.

Rackspace CTO John Engates allowed in a statement that "Running Hadoop on your own is complex, which is why we're excited about our development efforts with Hortonworks. We believe Hortonworks as a collaborator brings a substantial advantage in technology, services and experience that will clearly benefit customers. We have customers today that deploy Hadoop clusters on dedicated hardware at Rackspace with support from Hortonworks. By joining forces, we intend to turn Hadoop into an on-demand service running on the Rackspace open cloud and in clusters on private cloud infrastructure in our data centers or the customer's data center. The Hortonworks Data Platform packages the open source Apache version of Hadoop. That aligns with our vision of an open cloud future that eliminates fear of vendor lock-in, and allows customers to confidently invest in a technology for the long term."

The pair means to deliver the "gold standard" for Hadoop on OpenStack public and private clouds. The efforts will entail:

  • Rackspace releasing an OpenStack public cloud-based Hadoop service that's validated and supported by Hortonworks and enables customers to create a Big Data environment in a matter of minutes (a rough sketch of such a deployment follows this list);
  • Customers engaging Hortonworks to support running Hadoop on Rackspace infrastructure, including its public cloud, private cloud inside Rackspace, and its dedicated servers;
  • Hortonworks providing customer guidance and support to customers who want to deploy Hortonworks on Rackspace Private Cloud Software in their own datacenter; and
  • Hortonworks and Rackspace joining the technical certification programs for Rackspace Private Cloud Software and the Hortonworks Data Platform, respectively.
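Neither company has published the interface for the planned service, but as a rough illustration of what standing up a Hadoop cluster on an OpenStack cloud "in a matter of minutes" might look like, here is a minimal sketch using the openstacksdk Python library; the cloud name, image, flavor, network and key names below are placeholders, not part of the actual Rackspace/Hortonworks offering:

    # Hypothetical sketch: booting a small Hadoop cluster on an OpenStack cloud.
    # All identifiers (cloud name, image, flavor, network, keypair) are placeholders.
    import openstack

    # Credentials come from a clouds.yaml entry named "rackspace-demo" (placeholder).
    conn = openstack.connect(cloud="rackspace-demo")

    # Boot one master and three worker nodes from a pre-built Hadoop image (hypothetical).
    for i in range(4):
        conn.compute.create_server(
            name="hadoop-node-%d" % i,
            image_id="IMAGE_UUID",                # image with the Hortonworks Data Platform pre-installed
            flavor_id="FLAVOR_ID",                # e.g. a general-purpose 8 GB flavor
            networks=[{"uuid": "NETWORK_UUID"}],  # tenant network for the cluster
            key_name="hadoop-admin",              # existing keypair for SSH access
        )

A managed service along the lines the two companies describe would presumably wrap this kind of provisioning, plus Hadoop configuration, behind a single request.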

Rackspace has been using Hadoop since 2008 for mission-critical uses: its Emails & Apps Division processes billions of e-mails a year to troubleshoot and diagnose customer issues; and its billing department analyzes cloud usage data daily to generate customer invoices.

Hortonworks is a leading contributor to Apache Hadoop and has been partnering with a number of companies, including Intel, to deliver an enterprise-grade Hadoop environment that can be deployed seamlessly. Intel is working with Rackspace and Hortonworks to optimize the widgetry on its Xeon platform.

More Stories By Maureen O'Gara

Maureen O'Gara, the most-read technology reporter of the past 20 years, is the Cloud Computing and Virtualization News Desk editor of SYS-CON Media. She is the publisher of the famous "Billygrams" and was the editor-in-chief of "Client/Server News" for more than a decade. One of the most respected technology reporters in the business, Maureen can be reached by email at maureen(at)sys-con.com or paperboy(at)g2news.com, and by phone at 516 759-7025. Twitter: @MaureenOGara
