Crunching the Numbers in Search of a Greener Cloud

All of that hardware must be powered and cooled, and all of those offices must be lit

Although sometimes portrayed as a big computer in the sky, the reality of cloud computing is far more mundane. Clouds run on physical hardware, located in data centres, connected to one another and to their customers via high-speed networks. All of that hardware must be powered and cooled, and all of those offices must be lit. Whilst many data centre operators continue to make welcome strides toward increasing the efficiency of their buildings, machines and processes, these advances remain a drop in the ocean next to the environmental implications of the choice of power source. With access to good information, might it be possible for users of the cloud to make choices that save themselves money, whilst at the same time saving (a bit of) the planet?

Greenpeace has consistently drawn attention to the importance of energy choices in evaluating the environmental credentials of data centres, with 2011’s How Dirty Is Your Data? report continuing to polarise arguments after more than a year. The most efficient modern data centres deploy an impressive arsenal of tricks to save energy (and therefore money), and to burnish their green credentials. They use the most efficient modern processors, heat offices with waste server heat, cool servers with water from the toilets and the sea, or keep air conditioning costs low by opening the building when it’s cool outside. But analysis from London’s Mastodon C suggests that these efforts, although laudable, typically trim only a few percentage points from a data centre’s environmental impact. According to Mastodon C CEO and co-founder Francine Bennett, a whopping 61% of a data centre’s environmental footprint can be attributed to choosing dirty power sources like coal. Efficient data centre design is to be welcomed, but we shouldn’t make the mistake of assuming that efficient data centres are necessarily green data centres. The converse is also true, but if the figures are to be believed it has less serious consequences for the planet.

Dirty – and finite – power sources such as oil, coal, and gas remain the mainstay of power generation in most countries. According to figures from the Energy Information Administration in the United States, 37% of US energy consumption in 2010 was from ‘oil and other liquids,’ 21% was from coal, 9% was nuclear, 25% was gas, 1% was liquid biofuels, and only 7% was from renewables. More recent data suggests little change in the US’ spread of energy sources, although other countries are less reliant on coal. 2009 statistics (page 7) from the International Energy Agency suggest that coal accounts for 19.7% of consumption amongst OECD countries. More worryingly, although coal accounts for only 21% of consumption in the US, it has a disproportionate impact upon carbon emissions (a metric for which the US tops the table). Looking at 2010’s figures for carbon dioxide emissions directly attributable to power generation, coal’s 21% contribution to the consumption figure is responsible for 80% of the emissions total. By 2012 that has improved a little, to a mere 78%. Every small move away from coal has a large downstream effect on carbon emissions.

Energy-related carbon dioxide emissions attributable to generation of electricity
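Put those two figures side by side and the leverage becomes obvious. A back-of-the-envelope sketch, using only the 21% and 80% shares quoted above (everything else is simple division):

# Rough arithmetic: how much dirtier is coal than the rest of the mix,
# given the shares quoted above (21% of consumption, 80% of power-sector CO2)?
coal_share_of_consumption = 0.21
coal_share_of_emissions = 0.80

# Carbon intensity relative to the grid average (average = 1.0)
coal_intensity = coal_share_of_emissions / coal_share_of_consumption              # ~3.8x the average
rest_intensity = (1 - coal_share_of_emissions) / (1 - coal_share_of_consumption)  # ~0.25x the average

print(f"Coal is roughly {coal_intensity / rest_intensity:.0f}x dirtier than the rest of the mix")

On those numbers, coal works out at roughly fifteen times the carbon intensity of everything else on the grid combined, which is why shifting even a modest share of demand away from it has such a large downstream effect.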

So data centres should just stop using coal then, right? That’s certainly what Greenpeace wants. But the picture is, of course, not quite that simple. Data centres require significant up-front investment, often years before the first customer pays anyone any money. Grants, incentives, and inward investment programmes may all lead data centre builders to choose otherwise odd locations for their new facilities. Data centre operators need power that is predictable, reliable, and affordable. They often simply draw most of that power from the utility grid, which will get its energy from a variety of suppliers. Offsets from planting a few trees or selling electricity generated by the windmills on your roof do nothing significant to compensate for the megawatts you’re sucking down from your closest coal-fired power station. As Amazon’s James Hamilton noted last week, data centres often want or need to be situated within easy reach of population centres. Bandwidth matters, so much so that it sometimes makes business sense to pay for cooling a data centre in a desert. Renewables such as solar, wind, and biofuels are good for carbon emissions, but can have other less welcome consequences as carbon-capturing forests and food-producing farmland are cleared to make way for solar arrays, windmills and oil palm plantations. Geothermal power is abundant, clean and almost free, but often a long way from prospective customers, and tainted by (unfair) association with geological instability. No one wants their data centre engulfed by a lava flow.

Data centres are big investments, amortised over many years. Their locations are selected for a whole host of reasons, of which the greenness of the electricity supply is only one. Some data centre providers will make much of their greenness, and may even see a business opportunity to charge a premium price that helps their customers feel good about themselves. Others say as little as possible, either because they don’t think we’ll like the truth or because (they say) no one is asking them the question.

But many users of these data centres have more room for manoeuvre. They have a choice, and maybe they just need enough information to let them exercise that choice wisely.

Some jobs will always need to be kept close, down the fattest, shortest, fastest pipe you can find. In low latency trading, for example, the speed of light presents a bottleneck. Other jobs might need to run in (or avoid) specific geographies. European data protection rules, financial and healthcare regulations in many countries, and most governments’ sensitivity about clandestine snooping on their activities are all reasons that have been used to place data in one place rather than another. A third class of jobs might need to run on one cloud rather than another. They’re optimised to utilise the features of a particular cloud provider, or they require an operating system or libraries or granular controls that only certain providers support. But even in each of these cases, there is often an element of choice. More than one data centre is easily accessible to a Wall Street trader. More than one cloud provider satisfies US/European Safe Harbor Provisions. Almost every significant cloud infrastructure provider offers mechanisms to choose one of their data centres over another. And then there’s the (far larger?) class of jobs that could run anywhere they can find a Windows or Linux virtual machine. For them, the choices are many and varied. And in a big data context, where a single job might spin up thousands of machines, those choices have real – measurable – environmental implications.

CO2 emissions vary by location… and time of day. Image © Mastodon C.

And that’s where some of the work being done by Mastodon C comes in. By gathering real data on climate (which is responsible for 20% of a data centre’s environmental footprint), power source (up to 61%) and server power usage, and adding educated estimates regarding efficiency initiatives inside the data centre, the company can tell you the greenest place to run a compute job right now. Unseasonably cold in Singapore this week? Send your jobs to Asia. Sun visits Dublin for the day? Maybe avoid Ireland until the inevitable happens.
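It is not hard to imagine how such a model hangs together, even without seeing Mastodon C's implementation. The sketch below is purely illustrative: the scoring function, the PUE guesses and every number in the candidate list are invented, and only the three inputs (grid carbon intensity, local climate, in-building efficiency) come from the description above.

# Toy ranking of data centre locations by estimated carbon cost of a compute job.
# Inputs mirror the model described above: grid carbon intensity (the dominant
# factor), current climate (cooling load) and an in-building efficiency estimate.
# All numbers are made up for illustration.

def emissions_score(grid_gco2_per_kwh, outside_temp_c, pue_estimate):
    """Lower is greener. A toy heuristic, not Mastodon C's actual methodology."""
    cooling_penalty = max(0.0, outside_temp_c - 15) * 0.01  # warmer air -> more cooling energy
    return grid_gco2_per_kwh * pue_estimate * (1 + cooling_penalty)

candidates = {
    # location: (grid gCO2/kWh, outside temperature in C, assumed PUE)
    "Virginia":  (520, 33, 1.5),
    "Oregon":    (350, 18, 1.4),
    "Dublin":    (450, 12, 1.3),
    "Singapore": (500, 31, 1.6),
}

greenest = min(candidates, key=lambda name: emissions_score(*candidates[name]))
print("Greenest place to run the job right now:", greenest)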

Cloud developers are creatures of habit. They’ll take default settings. They’ll send jobs to the same Region they used last time. And all of that means they tend to use Amazon… and they tend to use Amazon’s US-EAST region, in Virginia.

Mastodon C offers a web tool to display current figures on the CO2 emissions attributable to servers in different data centres around the world. Today, the tool shows figures for Iceland’s Greenqloud and IaaS giant Amazon, but even that offers some useful insight. As Francine Bennett notes, the vast majority (possibly 70%) of Amazon jobs run in the company’s Virginia data centre. When Virginia’s cool (which it rarely is during the summer months), this data centre’s not that bad, but when temperatures begin to rise only sun-drenched Dublin (erm…) and monsoon-gripped Singapore score more poorly on the emissions scale. Amazon’s Oregon data centre costs exactly the same as Virginia, but emissions are typically far lower. So if latency isn’t a principal concern (and it often isn’t for a big data job that’s left to get on with churning through a pile of data in an S3 bucket), and your data is already going to be processed in the United States, why not send it to green Oregon by default, instead of soot-stained Virginia?
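Acting on that insight takes a single line of code: point your client at Oregon rather than Virginia. A minimal sketch using boto3, assuming the job really is latency-tolerant (the AMI ID is a placeholder and the instance type is an arbitrary choice):

import boto3

# Same price as us-east-1 (Virginia), typically far lower emissions:
# run the job in us-west-2 (Oregon) instead.
ec2 = boto3.client("ec2", region_name="us-west-2")

response = ec2.run_instances(
    ImageId="ami-xxxxxxxx",   # placeholder: use an AMI that exists in us-west-2
    InstanceType="m3.large",  # arbitrary choice for illustration
    MinCount=1,
    MaxCount=1,
)
print("Launched", response["Instances"][0]["InstanceId"], "in Oregon")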

Amazon’s most expensive facility, in Brazil, is even greener than Oregon, but the price puts a lot of potential customers off. So much so that spot prices for the site are often remarkably low. So if your compute jobs are amenable to running (and being killed from time to time) on a spot instance, São Paulo is also worth a look.
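The same applies to São Paulo's spot market. If the job can tolerate interruption, something like the sketch below will do; the bid price and AMI are placeholders, and current spot prices obviously vary.

import boto3

ec2 = boto3.client("ec2", region_name="sa-east-1")  # São Paulo

# Request interruptible spot capacity; the job must cope with being killed.
response = ec2.request_spot_instances(
    SpotPrice="0.10",                # placeholder maximum bid in USD per hour
    InstanceCount=1,
    LaunchSpecification={
        "ImageId": "ami-xxxxxxxx",   # placeholder AMI for sa-east-1
        "InstanceType": "m3.large",
    },
)
print(response["SpotInstanceRequests"][0]["SpotInstanceRequestId"])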

Greenqloud and AWS, of course, are only part of the cloud infrastructure picture. Bennett says that the company is keen to include similar data for other significant cloud providers such as Rackspace and Microsoft. Rather than predict data centre efficiency figures as they’ve done for Amazon, Bennett says they’re keen to work with the cloud providers directly, and to incorporate actual measurements from inside the data centres into the model.

Mastodon C is also about to release an API to the model behind the pretty UI, which developers (or cloud management companies like RightScale) can then incorporate into their own code. Why couldn’t a big data job simply place itself in the greenest location at run-time?
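Once that API exists, there is no obvious reason why not. A hypothetical sketch of what such a call might look like; the endpoint, the response format and the helper function are all invented for illustration, since the real API had not been published at the time of writing.

import boto3
import requests

# Entirely hypothetical endpoint and response shape -- the real API may differ.
CARBON_API_URL = "https://api.example.com/greenest-region"

def greenest_region(candidates):
    """Ask a (hypothetical) carbon-intensity service to pick the greenest region."""
    resp = requests.get(CARBON_API_URL, params={"regions": ",".join(candidates)})
    resp.raise_for_status()
    return resp.json()["greenest"]   # assumed response field

region = greenest_region(["us-east-1", "us-west-2", "sa-east-1"])
ec2 = boto3.client("ec2", region_name=region)
# ...spin up the cluster here, in whichever region is greenest right now.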

The environment is not the only consideration in deciding where to send compute jobs. But if tools like Mastodon C’s can shine an accurate light on the financial and environmental costs of different data centres, then it seems inevitable that people will begin to pay attention. Not, perhaps, the corporate CIO in his big BMW (at least not immediately). But the hipster founders of the next Facebook, the next Zynga, and the next Google, with their Teslas and Nests? Surely they’d be quick to embrace the means to get their computing done just as fast, just as cheaply, but greener?

Finally, there’s the subtext hidden between all the graphs and statistics that Mastodon C can show. Carbon emissions from data centres fluctuate with oil prices, the weather, and more. And those fluctuations mean that the price a data centre owner pays to run a given server for a given time fluctuates too. But, as a customer, you don’t see those price fluctuations. You pay your $0.64 to run a virtual machine in Amazon’s Virginia data centre, regardless of whether they’ve had to turn the aircon on or not. It’s 33°C there as I type, so they probably have.

At what point – if ever – would a data centre provider consider reflecting some of this variation in the actual price they charge? Would it be a transparent, fair, and honest way to pass on their true costs, or an unpredictable nightmare that would make any sort of long-term planning impossible?

You often have a choice about where you do your computing. Habit and laziness perhaps mean you don’t always exercise that choice, but maybe a visit to Mastodon C’s web dashboard will be enough to make you place your next cloud job somewhere other than the default.

What do you think? Are carbon footprints and temperature graphs and the rest something that cloud customers can and should concern themselves with? Do our small actions matter, or is it easier to just leave all of this to the people who run big data centres?

Image of Nesjavellir by Flickr user Lydur Skulason


More Stories By Paul Miller

Paul Miller works at the interface between the worlds of Cloud Computing and the Semantic Web, providing the insights that enable you to exploit the next wave as we approach the World Wide Database.

He blogs at www.cloudofdata.com.
