Understanding Windows Azure

Part 2: A look inside the Windows Azure datacenters

To understand Windows Azure and the Azure Services Platform, it's necessary to understand how Microsoft's datacenters work. This article provides an overview of how Microsoft designs its datacenters and why the Generation 4 Datacenters are so revolutionary.

The Building of Datacenters
Microsoft has been building datacenters for a long time. One of the best-known services Microsoft offers is Windows Update, which delivers updates all over the world as part of its content delivery network. But this is not the only product Microsoft's datacenters are known for. Other important products are Windows Live Messenger, Hotmail and Windows Live ID. Windows Live Messenger is one of the largest instant messaging services, and Hotmail is one of the most widely used e-mail services. Microsoft authenticates millions of users every day with its Live Services, which are used for Hotmail, Messenger and numerous other services. As you can see, Microsoft has plenty of experience building datacenters, but until Windows Azure it hadn't sold this infrastructure as a product.

Microsoft's G4 - Generation 4 - Datacenters
Microsoft Research did a great job of improving Microsoft's datacenters, especially how they are built. Microsoft calls the result the G4 - Generation 4 - Datacenter. These datacenters follow an industrial design: components are standardized, which lowers costs and enables vendors to use templates when designing their servers for Microsoft. Generation 4 Datacenters are basically built from containers - yes, exactly the containers we think of when we think of shipping containers. This design has major advantages. Imagine a datacenter needs to be relocated: Microsoft would only need a couple of trucks and some property, and the relocation would be almost done. Another advantage is that server vendors such as HP or Dell know exactly what the server racks should look like, because the racks have to fit into a container. If a datacenter needs to grow, a Generation 4 Datacenter simply adds containers to the existing ones. In addition, Microsoft focused on using standard tools for the cooling system so that local maintenance workers can easily be trained on it. It's important to note that a Generation 4 Datacenter isn't just a containerized server room: Microsoft improves the entire life cycle of how the datacenter is built and operated. This gives Microsoft additional benefits such as faster time-to-market and reduced costs.

How Microsoft Datacenters Help Protect the Environment
The term "Green IT" has been around for a while. Microsoft takes this term seriously and tries to minimize the energy consumption of their datacenters. For Microsoft this is not only the possibility of lowering the energy and cooling costs but also to protect our environment. With the Generation 4 Datacenters, Microsoft tries to build the containers with environmentally friendly materials and to take advantage of "ambient cooling." The last one focuses on reducing the amount of energy that needs to be invested to cool the server systems by taking advantage of the datacenter's environment. There are a couple of best practices and articles available on what Microsoft does to build environmentally friendly datacenters. I have included some links at the end of the article.

For an overview of Microsoft's datacenter design, there is a video that explains how Generation 4 Datacenters are built.

Security in Microsoft's Datacenters
Microsoft has a long tradition of building datacenters and operating systems. For decades, Microsoft has had to face hackers, viruses and other malware trying to attack its operating systems. More than other vendors, Microsoft learned from these attacks and built a comprehensive approach to security. The document I refer to at the end of this article describes Microsoft's strategy for a safe cloud computing environment. Microsoft built an online services security and compliance team that focuses on implementing security in its applications and platforms. Microsoft's key assets for a safe and secure cloud computing environment are the commitment to trustworthy computing and the need for privacy; Microsoft works with a "privacy by default" approach.

To secure its datacenters, Microsoft holds datacenter security certifications from various organizations such as ISO/IEC and the British Standards Institution. Furthermore, Microsoft uses the ISO/IEC 27001:2005 framework for security, which is built around the four steps "Plan, Do, Check, Act."

If you want to go deeper into this topic, I recommend reading "Securing Microsoft's Cloud Infrastructure."

What Happens with the Virtual Machines?
Figure 1 shows what goes on in a Windows Azure datacenter. I found this information on David Lemphers's blog, where he gave an overview of what happens in the datacenter. First, a server is started and a maintenance OS is downloaded. This OS talks to a service called the "Fabric Controller," which is in charge of overall platform management, and the server gets the instruction to create a host partition with a host VM. Once this is done, the server restarts and boots the host VM natively. The host VM is configured to run in the datacenter and to communicate securely with other VMs. The services we use don't run in the host VM; they run in another VM, called the guest VM, which runs within the host VM. Since the introduction of the VM Role, every guest VM holds a diff store that records the changes made to the virtual machine, so the standard image is never modified. Each host VM can contain several guest VMs.
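To make this sequence easier to follow, here is a minimal Python sketch of the boot and provisioning flow as described above. It is purely illustrative: the FabricController, HostVM and GuestVM classes and the diff-store dictionary are assumptions made up for this example, not Microsoft's actual interfaces.

# Hypothetical model of the provisioning flow -- names are illustrative only.

class FabricController:
    """Stands in for the service in charge of overall platform management."""

    def provision(self, server_id):
        # The maintenance OS reports in; the controller answers with the
        # instruction to create a host partition with a host VM.
        return "create-host-vm"


class GuestVM:
    def __init__(self, base_image):
        self.base_image = base_image   # shared, read-only standard image
        self.diff_store = {}           # per-VM changes (copy-on-write)

    def write(self, path, data):
        # Changes land in the diff store; the standard image is never modified.
        self.diff_store[path] = data

    def read(self, path):
        # Reads prefer the diff store and fall back to the base image.
        return self.diff_store.get(path, self.base_image.get(path))


class HostVM:
    """Booted natively on the server; hosts the guest VMs that run our services."""

    def __init__(self):
        self.guests = []

    def start_guest(self, base_image):
        guest = GuestVM(base_image)
        self.guests.append(guest)      # one host VM can contain several guests
        return guest


def boot_server(fabric_controller):
    # 1. The server starts and downloads a maintenance OS (simulated here).
    # 2. The maintenance OS talks to the Fabric Controller.
    instruction = fabric_controller.provision("server-01")
    assert instruction == "create-host-vm"
    # 3. The server restarts and boots the host VM.
    return HostVM()


if __name__ == "__main__":
    host = boot_server(FabricController())
    base = {"/etc/config": "default"}          # the standard image
    guest = host.start_guest(base)
    guest.write("/etc/config", "customized")   # goes to the diff store
    print(guest.read("/etc/config"))           # -> customized
    print(base["/etc/config"])                 # -> default (image untouched)

The last two lines show the point of the diff store: writes from a guest VM never touch the shared standard image, so it can be reused unchanged for every guest.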


This article is part of the Windows Azure Series on Cloud Computing Journal. The series was originally posted on Codefest.at, the official blog of the Developer and Platform Group at Microsoft Austria. You can see the original series here.

More Stories By Mario Meir-Huber

Mario Meir-Huber studied Information Systems at the University of Linz. He worked in the IT sector for some years before founding CodeForce, an IT consulting and services company, together with Andreas Aschauer. Since the advent of cloud computing, he has been passionate about this technology. He talks about cloud computing at various international events and conferences and writes for industry-leading magazines on the topic. He is a cloud computing expert in various independent IT organizations and wrote a book on cloud computing covering all topics of the cloud. You can follow Mario on Twitter (@mario_mh) or read his blog at http://cloudvane.wordpress.com.
