Understanding Windows Azure

Part 2: A look inside the Windows Azure datacenters

To understand Windows Azure and the Azure Services Platform, it's necessary to understand how the Microsoft datacenters work. This article provides an overview of how Microsoft designs its datacenters and why the Generation 4 Datacenters are so revolutionary.

The Building of Datacenters
Microsoft has been building datacenters for a long time. One of the best-known services Microsoft offers is Windows Update, which delivers updates all over the world as part of the company's content delivery network. But this is not the only product Microsoft's datacenters are known for. Other important products are Windows Live Messenger, Hotmail and Windows Live ID. Windows Live Messenger is one of the largest instant messaging services, and Hotmail is one of the most widely used e-mail services. Microsoft authenticates millions of users every day with Windows Live ID, which is used for Hotmail, Messenger and numerous other services. As you can see, Microsoft has plenty of experience building datacenters, but until now it hasn't sold products like Windows Azure.

Microsoft's G4 - Generation 4 - Datacenters
Microsoft Research did a great job of improving Microsoft's datacenters, especially how they are built. Microsoft calls this the G4, or Generation 4, Datacenter. These datacenters have an industrial design: components are standardized, which lowers costs and lets vendors work from templates when designing their servers for Microsoft. Generation 4 Datacenters are basically built in containers - yes, exactly the containers we think of when we think of shipping containers. This design has major advantages. Imagine a datacenter needs to be relocated: Microsoft would only need a couple of trucks and some property, and the relocation is almost done. Another advantage is that server vendors such as HP or Dell know exactly what the server racks should look like, since the racks are delivered pre-installed in a container. If a datacenter needs to grow, a Generation 4 Datacenter simply adds additional containers to the existing ones. In addition, Microsoft focused on using standard tools for the cooling system so that local maintenance workers can easily be trained on it. It's important to note that a Generation 4 Datacenter isn't just a containerized server room: with Generation 4, Microsoft improves the entire life cycle of how its datacenters are built and operated. This gives Microsoft additional benefits such as faster time-to-market and reduced costs.

How Microsoft Datacenters Help Protect the Environment
The term "Green IT" has been around for a while. Microsoft takes this term seriously and tries to minimize the energy consumption of their datacenters. For Microsoft this is not only the possibility of lowering the energy and cooling costs but also to protect our environment. With the Generation 4 Datacenters, Microsoft tries to build the containers with environmentally friendly materials and to take advantage of "ambient cooling." The last one focuses on reducing the amount of energy that needs to be invested to cool the server systems by taking advantage of the datacenter's environment. There are a couple of best practices and articles available on what Microsoft does to build environmentally friendly datacenters. I have included some links at the end of the article.

For an overview of Microsoft's datacenter design, see the video that explains how Generation 4 Datacenters are built.

Security in Microsoft's Datacenters
Microsoft has a long tradition of building datacenters and operating systems. For decades, Microsoft has had to face hackers, viruses and other malware trying to attack its operating systems. More than other vendors, Microsoft learned from these attacks and built a comprehensive approach to security. The document I refer to in this article describes Microsoft's strategy for a safe cloud computing environment. Microsoft created an Online Services Security and Compliance team that focuses on implementing security in its applications and platforms. Microsoft's key assets for a safe and secure cloud computing environment are its commitment to trustworthy computing and to privacy; Microsoft works with a "privacy by default" approach.

To secure its datacenters, Microsoft holds datacenter security certifications from various organizations such as ISO/IEC and the British Standards Institution. Furthermore, Microsoft uses the ISO/IEC 27001:2005 framework for security, which consists of the four steps "Plan, Do, Check, Act."

If you want to go deeper into this topic, I recommend reading "Securing Microsoft's Cloud Infrastructure."

What Happens with the Virtual Machines?
Figure 1 explains exactly what goes on in a Windows Azure datacenter. I found this information on David Lemphers' blog, where he gives an overview of what happens in the datacenter. First, the servers are started and a maintenance OS is downloaded. This OS talks to a service called the "Fabric Controller," which is in charge of overall platform management, and the server gets the instruction to create a host partition with a host VM. Once this is done, the server restarts and loads the host VM. The host VM is configured to run in the datacenter and to communicate securely with other VMs. The services we use don't run in the host VM; another VM, called the guest VM, runs within the host VM (the host VM is booted natively). Since the introduction of the VM Role, every guest VM holds a diff-store that records the changes made to the virtual machine, so the standard image is never modified. Each host VM can contain several guest VMs.
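To make this provisioning sequence easier to follow, here is a minimal Python sketch that models it: a server boots into a maintenance OS, the Fabric Controller instructs it to create and boot a host VM, and guest VMs then run inside that host VM, each keeping its changes in a diff-store while the shared standard image stays untouched. All class and method names here are illustrative assumptions made for this article, not the actual Azure fabric API.

# Illustrative model of the provisioning flow described above.
# Names are hypothetical; this is not a real Azure interface.

class FabricController:
    """Central service in charge of overall platform management."""

    def provision(self, server):
        # Instruct the server to create a host partition with a host VM,
        # then reboot into that host VM.
        server.create_host_vm()
        server.reboot_into_host_vm()


class GuestVM:
    """Runs inside the host VM; keeps its changes in a diff-store so the
    shared standard image is never modified."""

    def __init__(self, name, base_image):
        self.name = name
        self.base_image = base_image   # read-only standard image
        self.diff_store = {}           # per-VM changes only

    def write(self, path, data):
        # Changes land in the diff-store, not in the base image.
        self.diff_store[path] = data

    def read(self, path):
        # Prefer the diff-store; fall back to the untouched base image.
        return self.diff_store.get(path, self.base_image.get(path))


class Server:
    """Physical server in the datacenter."""

    def __init__(self):
        self.state = "maintenance_os"    # maintenance OS downloaded at boot
        self.host_vm_guests = []

    def create_host_vm(self):
        self.state = "host_partition_created"

    def reboot_into_host_vm(self):
        self.state = "host_vm_running"   # host VM is booted natively

    def start_guest(self, name, base_image):
        # Several guest VMs can run inside one host VM.
        guest = GuestVM(name, base_image)
        self.host_vm_guests.append(guest)
        return guest


if __name__ == "__main__":
    base_image = {"/etc/config": "default"}   # shared, never modified
    server = Server()
    FabricController().provision(server)

    vm = server.start_guest("role_instance_0", base_image)
    vm.write("/etc/config", "customized")
    print(server.state)                # host_vm_running
    print(vm.read("/etc/config"))      # customized (from the diff-store)
    print(base_image["/etc/config"])   # default (base image untouched)

Running the sketch prints the server state, the customized value from the guest's diff-store, and the unchanged default from the base image, which mirrors how guest VMs isolate their modifications from the standard image.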

Resources


This article is part of the Windows Azure series on Cloud Computing Journal. The series was originally posted on Codefest.at, the official blog of the Developer and Platform Group at Microsoft Austria. You can see the original series here.

More Stories By Mario Meir-Huber

Mario Meir-Huber studied Information Systems at the University of Linz. He worked in the IT sector for several years before founding CodeForce, an IT consulting and services company, together with Andreas Aschauer. Since the advent of Cloud Computing, he has been passionate about the technology. He speaks about Cloud Computing at international events and conferences and writes for industry-leading magazines on the subject. He is a Cloud Computing expert in various independent IT organizations and has written a book covering all aspects of the Cloud. You can follow Mario on Twitter (@mario_mh) or read his blog at http://cloudvane.wordpress.com.
