By Stephen E. Arnold
November 17, 2008 12:00 AM EST
Stephen E. Arnold's Blog
Google has shifted from solving problems in distributed, massively parallel computing to developing next-generation cloud-centric applications. Google can, with the deployment of software, deliver global services that other companies cannot match in terms of speed of deployment, operation, and enhancement.
Cloud computing has become commonplace. Amazon has pumped steroids into the Amazon Web Services product line. Microsoft executives have been providing forecasts of a bold new service offering. Other vendors blasting off from mother earth to loftier realms include IBM, Intel, Rackspace, and other big name firms.
One of the most interesting documents I have read in months is a forthcoming technical paper from Microsoft’s Albert Greenberg, Parantap Lahiri, David Maltz, Parveen Patel, and Sudipta Sengupta. The paper is available from the ACM as document 978-1-60558-181-1/08/08. I have a hard copy in my hand, and I can’t locate a valid link to an online version. The ACM or a for-fee database may help you get this document. In a nutshell, “Towards a Next Generation Data Center Architecture: Scalability and Commoditization” explains some of the technical innovations Microsoft is implementing to handle cloud-based, high-demand, high-availability applications. Some of the information in the paper surprised me. The innovations provide a good indication of the problems Microsoft faced in its older, pre-2008 data centers. It was clear to me that Microsoft is making progress, and some of the methods echo actions Google took as long ago as 1998.
What put the Amazon and Microsoft cloud innovations into sharp relief for me was US2008/0262828, “Encoding and Adaptive Scalable Accessing of Distributed Models.” You can download a copy of this document from the easy-to-use USPTO system. Start here to obtain the full text and diagrams for this patent application. Keep in mind that a patent application does not mean that Google has implemented or will implement the systems and methods disclosed. What the patent application provides is a peephole through which we can look at some of the thinking that Google is doing with regard to a particular technical issue. The peephole may be small, but what I saw when I read the document and reviewed the drawings last night (October 24, 2008) sparked my thinking.
Before offering my opinion, let’s look at the abstract for this invention, filed in February 2006 in a provisional application. Keep in mind that we are looking in the rear view mirror here, not at where Google might be today. This historical benchmark is significant when you compare what Amazon and Microsoft are doing to deal with the cloud computing revolution that is gaining momentum. Here’s Google’s summary of the invention:
Systems, methods, and apparatus for accessing distributed models in automated machine processing, including using large language models in machine translation, speech recognition and other applications.
In typical Google style, there’s a certain economy to the description of an invention involving such technical luminaries as Jeff Dean and 12 other Googlers. The focus of the invention is on-the-fly machine translation. However, the inventors make it clear that the precepts of this invention can be applied to other applications as well. As you may know, Google has expanded its online translation capability in the last few months. If you have not explored this service, navigate to http://translate.google.com and try out the system.
The claims for this patent document are somewhat more specific. I can’t run through the 91 claims in this patent document. I can highlight one, and I will leave review of the other 90 to you. Claim 5 asserts:
The system of claim 4, wherein: the translation server comprises: a plurality of segment translation servers each operable to communicate with the translation model server, the language model servers and replica servers, each segment translation server operable to translate one segment of the source text into the target language, a translation front end to receive the source text and to divide the source text into a plurality of segments in the source language, and a load balancing module in communication with the translation front end to receive the segments of the source text and operable to distribute the segments to the segment translation servers for translation based on work load at the segment translation servers, the load balancing module further operable to direct translated segments in the target language from the segment translation servers to the translation front end.
The claim makes reasonably clear the nested structure of Google’s architecture. What impressed me is that this patent document, like other recent Google applications, treats the infrastructure as a platform. The computational and input/output tasks are simply not an issue. Google pretty clearly feels it has the horsepower to handle ad hoc translation in real time without worrying about how data are shoved around within the system. As a result, higher-order applications that were impossible even for certain large government agencies can be made available without much foot dragging. I find this remarkable.
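To make the claim language concrete, here is a minimal sketch of the pipeline claim 5 describes: a front end splits the source text into segments, a load balancing module hands each segment to the least-loaded segment translation server, and the translated segments flow back to the front end for reassembly. The class names, the sentence-based segmenter, and the stand-in translate() function are my own illustrative assumptions, not Google’s actual implementation.

```python
import heapq

class SegmentTranslationServer:
    """Stand-in for one of the claim's segment translation servers."""
    def __init__(self, name):
        self.name = name
        self.load = 0  # outstanding work, used by the load balancer

    def translate(self, segment):
        # A real server would consult the translation model and
        # distributed language model servers; uppercasing is a fake
        # "translation" so the sketch is runnable.
        self.load += len(segment)
        return segment.upper()

def front_end(source_text):
    # Translation front end: divide the source text into segments
    # (here, naively, one segment per sentence).
    return [s.strip() + "." for s in source_text.split(".") if s.strip()]

def load_balance(segments, servers):
    # Load balancing module: always dispatch the next segment to the
    # least-loaded server, per the claim's "based on work load" language.
    heap = [(srv.load, i, srv) for i, srv in enumerate(servers)]
    heapq.heapify(heap)
    translated = []
    for seg in segments:
        _, i, srv = heapq.heappop(heap)
        translated.append(srv.translate(seg))
        heapq.heappush(heap, (srv.load, i, srv))
    # Translated segments are directed back to the front end, which
    # reassembles them in source order.
    return " ".join(translated)

servers = [SegmentTranslationServer(f"sts-{n}") for n in range(3)]
text = "the quick brown fox. jumps over. the lazy dog."
print(load_balance(front_end(text), servers))
```

The point of the sketch is that once segments are independent units of work, translation throughput scales by adding commodity servers behind the load balancer, which is exactly the infrastructure-as-platform posture the patent reflects.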
This patent document, if Google is doing what the inventors appear to be saying, is significantly different from the innovations I just mentioned from such competitors as Amazon and Microsoft. Google in my opinion is making it clear that it has a multi-year lead in cloud computing.
The thoughts that I noted as I worked through the 38 pages of small print in this patent document were:
- Google has shifted from solving problems in distributed, massively parallel computing to developing next-generation cloud-centric applications. Machine translation in real time for a global audience for free means heavy demand. This invention essentially said to me, “No problem.”
- Google’s infrastructure will become more capable as Google deploys new CPUs and faster storage devices. Google, therefore, can use its commodity approach to hardware and experience significant performance gains without spending for exotic gizmos or trying to hack around bottlenecks such as those identified in the Microsoft paper referenced above.
- Google can, with the deployment of software, deliver global services that other companies cannot match in terms of speed of deployment, operation, and enhancement.
I may be wrong, and I often am, but I think Google is not content with its present lead over its rivals. I think this patent document is an indication that Google can put its foot on the gas pedal at any time and operate in a dimension that other companies cannot. Do you agree? Disagree? Let me learn where I am off base. Your view is important because I am finishing a write-up for Infonortics about Google and publishing. Help me think straight. I even invite Cyrus to chime in. The drawings in this patent application are among Google’s best that I have seen.
|jeffhardy 11/24/08 11:43:02 AM EST|
Cloud Computing Fact and Fiction
In mid-November I participated in a session at PubCon regarding Cloud Computing. My goal was to cut through the hype and buzz talk to articulate the real potential benefits and debunk false claims. I got a lot of feedback. So much so that I wrote a follow-up article:
It is important that we remember what Cloud Computing is and what it isn't.
|Jeremy Geelan 10/28/08 04:45:00 AM EDT|
Even though Google may be the elephant in the cloud, there are at least 49 others competing already in the cloud computing space including not just Amazon and Microsoft but also Akamai, Force.com, IBM, Sun, VMware and a host of others. I had a first shot at a Top Fifty list here: http://cloudcomputing.sys-con.com/node/665165