Cloud: Datacenters, Meet Software! By @IoT2040

The Open Compute Project Sets the Direction

The PC revolution has redefined the notion of a computer over the past four decades. Now it might be time to redefine the notion of a collection of computers, that is, to redefine the notion of a datacenter.

Datacenters are thought of as big places. Some of the more recent plants used by mega-users like Google, Amazon, Microsoft, and Facebook have acres of land under roof, with many tens of thousands of individual systems and power requirements that would support a small city.

Even your friendly local, on-site enterprise datacenter is likely to be a big room with a big budget commitment and a lot of people hired to manage it.

But what if a datacenter could fit in the corner of a room, or under a desk, or in the palm of your hand?

This seems to be the direction we're headed, as data loads simultaneously grow exponentially and become ever more distributed. This is also part of the vision I saw and heard outlined at the recent Open Compute Summit in San Jose.

Transparency as a Service
The summit was sponsored by The Open Compute Project Foundation, with a goal "to design and enable the delivery of the most efficient server, storage and data center hardware designs for scalable computing," according to its mission statement. Members strive to share ideas, specs, and intellectual property in an open environment. The foundation is keyed by Facebook and the company's commitment to transparency in how it builds out the massive datacenter infrastructure it requires.

One significant announcement at the summit was made by Vapor.io and company CEO Cole Crawford. The company aims for nothing less than utter transformation of the datacenter, starting with a programmable, open-source based management solution at the top of the stack.

Crawford and Chief Architect Steven White envision a modern, data-driven datacenter in which servers are "cattle, not pets," following the still-new concept of software-defined servers and datacenters. "The Open Data Center Runtime Environment is the first accepted contribution to the Open Compute Foundation using the reciprocal license, thus ensuring that forks and branches won't exist," according to Vapor.io. "We did this to ensure that when you are interacting with your data center, you're communicating over a community owned, community standard."
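To make the "cattle, not pets" idea concrete, here is a minimal sketch of what driving a datacenter through an open, programmable REST interface of this kind might look like. The base URL, endpoint paths, and JSON fields below are illustrative assumptions for this article, not the actual Open Data Center Runtime Environment API.

```python
# Hypothetical sketch of a programmable, software-defined datacenter interface.
# The base URL, endpoints, and response fields are assumptions for illustration only;
# consult the actual Open Data Center Runtime Environment spec for the real API.
import requests

BASE_URL = "http://dc-manager.example.local:5000/api/v1"  # hypothetical management endpoint


def list_racks():
    """Discover the racks and boards the management layer knows about."""
    resp = requests.get(f"{BASE_URL}/scan", timeout=5)
    resp.raise_for_status()
    return resp.json().get("racks", [])


def power_cycle(rack_id: str, board_id: str) -> None:
    """Treat servers as cattle, not pets: power-cycle a misbehaving board programmatically."""
    resp = requests.post(
        f"{BASE_URL}/power/{rack_id}/{board_id}",
        json={"action": "cycle"},
        timeout=5,
    )
    resp.raise_for_status()


if __name__ == "__main__":
    for rack in list_racks():
        print(rack.get("rack_id"), "->", len(rack.get("boards", [])), "boards")
```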

The company's ultimate vision is a modern hardware configuration that brings new levels of efficiency and output to datacenters of all sizes.

Mobility & Then the IoT
Mobility is today's primary driver of the fast-growing global dataflow. The worldwide proliferation of tablets, and especially smartphones, will push the total amount of data processed by the Internet past a zettabyte (1 billion terabytes) annually this year or next. That's more than 30 terabytes per second.
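As a quick sanity check on that figure, the arithmetic is straightforward; the only inputs are the zettabyte-to-terabyte conversion and the number of seconds in a year:

```python
# Back-of-the-envelope check: one zettabyte per year, expressed in terabytes per second.
ZETTABYTE_IN_TERABYTES = 1_000_000_000   # 1 ZB = 10^21 bytes = 10^9 TB
SECONDS_PER_YEAR = 365 * 24 * 60 * 60    # roughly 31.5 million seconds

tb_per_second = ZETTABYTE_IN_TERABYTES / SECONDS_PER_YEAR
print(f"{tb_per_second:,.1f} TB/s")      # about 31.7 TB/s, i.e., "more than 30 terabytes per second"
```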

Smartphone ownership will reach into the billions soon enough, and even in many developing countries, such as the Philippines, there are now more mobile phones than people.

But we ain't seen nuttin' yet: the Internet of Things (IoT) will be adding billions of new devices to the global Internet soon enough. Though much of the traffic it generates will be hyper-local (via Bluetooth and other short-range technologies), enough of it will travel across the Internet to push global data volumes into the dozens of zettabytes per year by 2020, according to Gartner and others.

Think of it as cloud computing to the nth degree in all dimensions. Think of the phrase made famous by Sun Microsystems, "the network is the computer," extending out to "the edge of the network is your computer."

The edge of the network seems to me much like the edge of the universe: to any single observer, there is no such thing. One person's edge is another's center. Cyberspace expands outward from wherever you are, and you will (some day) expect the same performance for your single device or your enterprise no matter where you are.

Big, bulky, centralized datacenters cannot provide this edge service ubiquitously and effectively. There is also the matter of energy consumption: datacenters accounted for about 2% of all US electricity consumption in 2011, and that number has certainly risen since then, although it did not rise as quickly as the EPA had originally estimated for the 2007-2011 period.

Focus on Power Consumption
But let's not get distracted by this particular metric. The big picture is one that features global power consumption and the aspiration of billions of people in developing nations to have better lives.

As I've written many times, and as much of our research at the Tau Institute explores, developing nations typically consume 3 to 5% of the per-person electricity of the developed world.

We believe that an aggressive national commitment to IT is a primary indicator of sustained economic and societal growth. To achieve significant economic improvement therefore means we must achieve significant new efficiencies in power consumption.

Right Direction
The vision laid out by Vapor.io seems to be a positive step in this direction. Crawford says the technology, which already has its first customer in Indiana in the US, aims for a PUE of 1.1, compared with an industry average of 1.9. (PUE, or power usage effectiveness, is a simple ratio: the total energy required by a datacenter divided by the amount used by the computing resources themselves. The overhead is primarily eaten up by air-conditioning.)
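In code, that ratio is all there is to it; a minimal sketch with invented sample figures:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power divided by IT equipment power."""
    return total_facility_kw / it_equipment_kw

# Sample figures invented for illustration: a 1.9-PUE facility versus the 1.1 target cited above.
print(pue(total_facility_kw=1900, it_equipment_kw=1000))  # 1.9 -- industry average
print(pue(total_facility_kw=1100, it_equipment_kw=1000))  # 1.1 -- Vapor.io's stated target
```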

Crawford and team go further, asserting that a new metric is needed: performance per watt per dollar, or PWD.
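Vapor.io does not spell out a formula here, but read literally, performance per watt per dollar would divide useful work by both the power drawn and the money spent. A sketch under that assumption, with invented numbers:

```python
def performance_per_watt_per_dollar(ops_per_second: float, watts: float, dollars: float) -> float:
    """Hypothetical reading of the PWD metric: useful work divided by power and by cost.
    The exact definition is an assumption here; Vapor.io's formal metric may differ."""
    return ops_per_second / (watts * dollars)

# Purely illustrative numbers: a server doing 1M ops/s at 500 W with a $10,000 price tag.
print(performance_per_watt_per_dollar(ops_per_second=1_000_000, watts=500, dollars=10_000))
```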

New efficiencies and new metrics are one big part of the puzzle. Another big part takes us back to the question near the beginning of this article. What if I could hold a datacenter in my hand? When will I be able to do this?

For now, the direction is being set. The world will need more mega-datacenter technology in smaller urban spaces, as mobility and the IoT inexorably drive dataflows upward. It will also need as much cloud-driven technology within buildings, and some day per person, as we can imagine.

The Software World
The third big piece of the puzzle involves the software that's eating the world, in the phrase made notorious by Marc Andreessen in the Wall Street Journal in 2011. The world of cloud computing is a world of virtualization, containers, languages, platforms, architectures, and many things as-a-service.

It is a world that is not familiar to many people in the world of datacenters. A grand conversation is beginning to take place, and it will need to intensify dramatically to sync up where the world of data is going with where the world of datacenters should be going.


More Stories By Roger Strukhoff

Roger Strukhoff (@IoT2040) is Executive Director of the Tau Institute for Global ICT Research, with offices in Illinois and Manila. He is Conference Chair of @CloudExpo & @ThingsExpo, and Editor of SYS-CON Media's CloudComputing BigData & IoT Journals. He holds a BA from Knox College & conducted MBA studies at CSU-East Bay.
