By Jeremy Geelan
December 1, 2008 03:00 PM EST
Cloud-based tools, including large-scale data-intensive computing as offered by Hadoop, are key to the rise and rise of cloud computing. In this wide-ranging Exclusive Q&A with SYS-CON's Cloud Computing Journal, the Director of Grid Services at Yahoo! - Rob Weltman - explains to Jeremy Geelan, Conference Chair of SYS-CON's 1st International Cloud Computing Conference & Expo held last week in San Jose, CA, how analyzing and learning from ever-growing volumes of business data is essential to continuously refining and improving service offerings.
Cloud Computing Journal: Yahoo! has been the largest contributor to the Hadoop project and uses Hadoop extensively in its Web search and advertising businesses. Can you explain a little of the background to that?
Rob Weltman: Yahoo! Search (and before it Inktomi) was a pioneer in using large clusters of commodity computers to speed up the crawling and indexing of Web sites. While working on the architecture and design of the next generation of Web Search crawling and indexing, we came in touch with Doug Cutting and the open source Lucene project for text indexing and search. Its Nutch sub-project contained a distributed file system with integrated computation using the map-reduce paradigm. It looked very promising and appropriate for many data-intensive applications, and Hadoop was then split out as its own project. Yahoo! supported Hadoop in a big way, both by contributing to its development as an open source project and by applying it to solve many large-scale data and computation problems in the company.
Hadoop has matured at an amazingly fast pace. From a 20-node cluster two years ago to many 2,000-node clusters today; from a somewhat embarrassing terasort (a benchmark) performance to being the terasort leader; from no access control to user- and group-owned files and directories. There is now a high-level language - Pig - that lets you express complex operations on data in an intuitive way and have them translated into Hadoop map-reduce jobs.
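The map-reduce paradigm the answer refers to can be sketched in plain Java as a single-process toy - counting words, the canonical example. This is an illustration of the idea only, not Hadoop's actual API; all class and method names here are hypothetical:

```java
import java.util.*;

// A minimal, single-process sketch of the map-reduce paradigm that Hadoop
// distributes across a cluster. Class and method names are illustrative,
// not Hadoop's real API.
public class WordCountSketch {

    // Map phase: turn each input line into (word, 1) pairs.
    static List<Map.Entry<String, Integer>> map(String line) {
        List<Map.Entry<String, Integer>> pairs = new ArrayList<>();
        for (String word : line.toLowerCase().split("\\s+")) {
            if (!word.isEmpty()) {
                pairs.add(Map.entry(word, 1));
            }
        }
        return pairs;
    }

    // Shuffle + reduce phase: group pairs by key and sum the values.
    static Map<String, Integer> reduce(List<Map.Entry<String, Integer>> pairs) {
        Map<String, Integer> counts = new TreeMap<>();
        for (Map.Entry<String, Integer> pair : pairs) {
            counts.merge(pair.getKey(), pair.getValue(), Integer::sum);
        }
        return counts;
    }

    public static Map<String, Integer> run(List<String> lines) {
        List<Map.Entry<String, Integer>> allPairs = new ArrayList<>();
        for (String line : lines) {
            allPairs.addAll(map(line));   // in Hadoop, mappers run in parallel, one per data block
        }
        return reduce(allPairs);          // in Hadoop, reducers run in parallel, one per key partition
    }

    public static void main(String[] args) {
        System.out.println(run(List.of("the cat", "the dog")));  // {cat=1, dog=1, the=2}
    }
}
```

Pig expresses the same kind of pipeline declaratively (load, group, aggregate) and compiles it into jobs of exactly this map/shuffle/reduce shape.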
In 2007, Hadoop at Yahoo! was used primarily for research - analyzing enormous volumes of data to find the best algorithms and parameters for selecting search results or ads to present to users. Now it is also a central component in many production operations, including Web Search, ad serving, and personalization.
Cloud Computing Journal: Are cloud-based tools like Hadoop the most important kinds of tools for the future, do you think?
RW: Being able to add capacity as needed without major software or infrastructure changes is clearly important for many organizations. Sharing resources and dynamically allocating more or less of them to various functions on demand is highly attractive as companies strive to control costs while their computing needs grow and shift. Analyzing and learning from ever-growing volumes of business data is essential to continuously refining and improving service offerings. The ability to quickly explore new algorithms and put them into production will be a competitive advantage for those with the resources to apply them. All of these speak to the importance of cloud computing.
Cloud Computing Journal: How important a role does Java play in the project? Is that because of the need to scale horizontally (and massively)?
RW: Hadoop supports programming and scripting in many languages. Hadoop, itself, is written in Java. The language provides strong support for the central infrastructure needs of system and network programming. There is a large body of experience in developing robust, performance-optimized, scalable platforms in Java.
Java provides portability to many hardware and software environments; however, Hadoop's horizontal scalability is not a result of the choice of language but of a design strongly focused on fault tolerance and distribution.
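The fault-tolerance design mentioned here boils down to a simple idea: a failed unit of work is re-executed rather than failing the whole job. A hedged sketch, assuming a generic task runner (these names are hypothetical, not Hadoop's scheduler API):

```java
import java.util.function.Supplier;

// Illustrative sketch of map-reduce fault tolerance: a failed task is simply
// re-executed, up to a retry limit, instead of aborting the whole job. In a
// real cluster each attempt may be scheduled on a different node; names here
// are hypothetical, not Hadoop's API.
public class RetryingRunner {
    static <T> T runWithRetries(Supplier<T> task, int maxAttempts) {
        RuntimeException last = null;
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
            try {
                return task.get();      // a fresh attempt at the same unit of work
            } catch (RuntimeException e) {
                last = e;               // record the failure and try again
            }
        }
        throw last;                     // give up only after exhausting attempts
    }

    public static void main(String[] args) {
        int[] attempts = {0};
        String result = runWithRetries(() -> {
            attempts[0]++;
            if (attempts[0] < 3) throw new RuntimeException("transient failure");
            return "ok";
        }, 5);
        System.out.println(result + " after " + attempts[0] + " attempts"); // ok after 3 attempts
    }
}
```

Because map tasks are deterministic functions of their input block, re-running one on another machine is safe, which is what makes commodity (and failure-prone) hardware viable.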
Cloud Computing Journal: Is the Yahoo! Search Webmap still the world's largest Hadoop production application so far as you are aware? Can you share some size data about Webmap with us?
RW: Yes, as far as I know, the Yahoo! WebMap is the largest Hadoop application in production. It uses 2,000+ computers and is still continuously growing. It produces 300TB of data per run, including 1.2 trillion links.
Cloud Computing Journal: How important are Hadoop clusters to Yahoo! overall? Do your Web search queries depend on them?
RW: Hadoop isn't directly involved in responding to queries typed in by users, but it is responsible for much of the backend work that produces the indexes used to service those queries. If the Hadoop clusters were down, the quality of search results would quickly degrade as the indexes became stale.
Cloud Computing Journal: Who else besides Yahoo! uses Hadoop to run large distributed computations?
RW: Many of the major Hadoop users are listed at http://wiki.apache.org/hadoop/PoweredBy. Facebook has several hundred nodes in a cluster for backend processing and analysis. Quantcast has several thousand cores in a very large cluster. Many companies, including AOL, A9 (Amazon), and IBM have deployed somewhat smaller clusters. It's likely that almost all of the uses involve large quantities of data.
Cloud Computing Journal: Can Hadoop be run on Amazon EC2?
RW: Absolutely! There is a ready-to-run AMI (virtual machine definition for EC2) for Hadoop. Among many others, Powerset (now owned by Microsoft) runs on EC2.
Cloud Computing Journal: What about Sun's Grid Engine - can it also be run on that?
RW: Yes, Hadoop works with Sun's Grid Engine but you lose the benefit of data locality (putting the computation of each piece of a distributed job near the data needed by that piece).
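The data-locality idea in this answer can be sketched as a tiny scheduling preference: given the nodes holding replicas of a task's input block, pick a free worker that already has the data, and fall back to a remote worker only when none does. A toy illustration under those assumptions (not Hadoop's actual scheduler code):

```java
import java.util.*;

// Illustrative sketch of data-locality scheduling: prefer a worker that
// already hosts a replica of the task's input block, so the computation
// moves to the data instead of the data moving to the computation.
// Class and method names are hypothetical.
public class LocalityPicker {
    static String pickWorker(Set<String> replicaHosts, List<String> freeWorkers) {
        for (String worker : freeWorkers) {
            if (replicaHosts.contains(worker)) {
                return worker;            // data-local: no network copy needed
            }
        }
        return freeWorkers.get(0);        // remote: the input block must be fetched
    }

    public static void main(String[] args) {
        Set<String> replicas = Set.of("node3", "node7");
        System.out.println(pickWorker(replicas, List.of("node1", "node7"))); // node7
        System.out.println(pickWorker(replicas, List.of("node1", "node2"))); // node1
    }
}
```

On a general-purpose scheduler that knows nothing about block placement, every task effectively takes the fallback branch, which is the cost Weltman describes.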
Cloud Computing Journal: Does the Hadoop team have any kind of a blog or forum?
RW: We have a blog at http://developer.yahoo.net/blogs/hadoop/. The team is also heavily engaged in the user and developer Hadoop mailing lists at hadoop.apache.org.
Cloud Computing Journal: Doug Cutting named it after his child's stuffed elephant. Is there any downside to an Enterprise IT tool having the name of a stuffed elephant?
RW: I did get some ribbing during the election period when I wore my Hadoop Summit t-shirt with the elephant on it, but I was able to clarify Hadoop's open source and non-partisan nature.
Cloud Computing Journal: What else have you and your team developed at Yahoo!, in terms of data-analytics applications for example?
RW: The Grid Computing development team at Yahoo! works on the Hadoop core software, the Pig high-level language, the ZooKeeper distributed coordination service, and the Chukwa monitoring and metric analysis system. In addition, it provides various Hadoop add-ons and tools that, for example, facilitate the joining of very large data sets or help teams understand and improve the performance and efficiency of Hadoop jobs. We provide consulting to application teams that develop large-scale Hadoop programs (often involving feature extraction, modeling, optimization, and index creation) but do not produce them ourselves.
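Joining very large data sets in map-reduce is commonly done as a "reduce-side join": records from both inputs are tagged with their source, keyed on the join field, and combined per key in the reduce phase. A single-process sketch of the pattern (illustrative only; this is not the actual Yahoo! tooling, and the data and names are made up):

```java
import java.util.*;

// Hedged sketch of a reduce-side join, the common map-reduce pattern for
// joining large data sets: tag records with their source, group by join key,
// then pair up tagged records within each group. Names are illustrative.
public class ReduceSideJoinSketch {
    static List<String> join(Map<String, String> users, Map<String, String> clicks) {
        // "Map + shuffle": tag each record with its source and group by join key.
        Map<String, List<String>> byKey = new TreeMap<>();
        users.forEach((id, name) ->
            byKey.computeIfAbsent(id, k -> new ArrayList<>()).add("U:" + name));
        clicks.forEach((id, url) ->
            byKey.computeIfAbsent(id, k -> new ArrayList<>()).add("C:" + url));

        // "Reduce": within each key's group, emit one row per (user, click) pair.
        List<String> out = new ArrayList<>();
        for (List<String> group : byKey.values()) {
            for (String u : group) if (u.startsWith("U:"))
                for (String c : group) if (c.startsWith("C:"))
                    out.add(u.substring(2) + " -> " + c.substring(2));
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(join(Map.of("1", "ann", "2", "bob"),
                                Map.of("1", "/home")));  // [ann -> /home]
    }
}
```

In a real cluster the grouping step is the shuffle, so each reducer sees all records for its keys regardless of which input, or which machine, they came from.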