By Mario Meir-Huber
February 11, 2011 08:00 AM EST
To understand Windows Azure and the Azure Services Platform, it's necessary to understand how the Microsoft datacenters work. This article provides an overview of how Microsoft designs its datacenters and why the Generation 4 Datacenters are so revolutionary.
The Building of Datacenters
Microsoft has been building datacenters for a long time. One of the best-known services Microsoft offers is Windows Update, which delivers updates all over the world through Microsoft's content delivery network. But this is not the only product Microsoft's datacenters are known for; other important products are Windows Live Messenger, Hotmail and Windows Live ID. Windows Live Messenger is one of the largest instant messaging services, and Hotmail is one of the most widely used e-mail services. With its Live Services, Microsoft authenticates millions of users every day for Hotmail, Messenger and numerous other services. As you can see, Microsoft has long experience building datacenters, but until Windows Azure it hadn't sold that infrastructure as a product.
Microsoft's G4 - Generation 4 - Datacenters
Microsoft Research did a great job of improving Microsoft's datacenters, especially how they are built. Microsoft calls the result the G4 - Generation 4 - Datacenters. They have an industrial design: components are standardized, which lowers costs and lets vendors work from templates when designing their servers for Microsoft. Generation 4 Datacenters are basically built in containers - yes, exactly the containers we think of when we think of shipping containers. There are major advantages to this design. Imagine a datacenter needs to be relocated: Microsoft would only need a couple of trucks and some property, and the relocation is almost done. Another key advantage is that server vendors such as HP or Dell know exactly what the server racks should look like when fitting them into a container. If a datacenter needs to grow, a Generation 4 Datacenter simply adds additional containers to the existing ones. In addition, Microsoft focused on standard tools for the cooling systems, so that local maintenance workers can easily be trained on them. It's important to note that a Generation 4 Datacenter isn't just a containerized server room: with Generation 4, Microsoft improves the entire life cycle of how the datacenter is built and operated. This gives Microsoft additional benefits such as faster time-to-market and reduced costs.
How Microsoft Datacenters Help Protect the Environment
The term "Green IT" has been around for a while. Microsoft takes it seriously and tries to minimize the energy consumption of its datacenters. For Microsoft this is not only a way to lower energy and cooling costs but also to protect our environment. With the Generation 4 Datacenters, Microsoft tries to build the containers from environmentally friendly materials and to take advantage of "ambient cooling." The latter reduces the amount of energy that must be invested to cool the server systems by taking advantage of the datacenter's surroundings. There are a couple of best practices and articles available on what Microsoft does to build environmentally friendly datacenters; I have included some links at the end of the article.
For an overview of Microsoft's datacenter design, there is a video that explains how Generation 4 Datacenters are built; see the links at the end of the article.
Security in Microsoft's Datacenters
Microsoft has a long tradition of building datacenters and operating systems. For decades, Microsoft has had to face hackers, viruses and other malware trying to attack its operating systems. More than other vendors, Microsoft learned from these attacks and built a comprehensive approach to security. The document I refer to at the end of this article describes Microsoft's strategy for a safe cloud computing environment. Microsoft built an online services security and compliance team that focuses on building security into its applications and platforms. Microsoft's key assets for a safe and secure cloud computing environment are its commitment to trustworthy computing and its focus on privacy: Microsoft works with a "privacy by default" approach.
To secure its datacenters, Microsoft holds datacenter security certifications from various organizations such as the ISO/IEC and the British Standards Institute. Furthermore, Microsoft uses the ISO/IEC 27001:2005 framework for security, which is built around the four-step cycle "Plan, Do, Check, Act."
If you want to go deeper into this topic, I recommend you read "Securing Microsoft's Cloud Infrastructure."
What Happens with the Virtual Machines?
Figure 1 shows exactly what goes on in a Windows Azure datacenter. I found this information on David Lemphers' blog, where he gave an overview of what happens in the datacenter. First, a server is started and downloads a maintenance OS. This OS talks to a service called the "Fabric Controller," which is in charge of overall platform management; it instructs the server to create a host partition with a host VM. Once this is done, the server restarts and boots the host VM (the host VM runs natively). The host VM is configured to run in the datacenter and to communicate securely with other VMs. The services we use don't run in the host VM; there is another VM, called the guest VM, that runs within the host VM. Since the introduction of the VM Role, every guest VM holds a diff store that records the changes made to the virtual machine, so the standard image is never modified. Each host VM can contain several guest VMs.
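The diff-store idea described above is essentially copy-on-write: every guest VM shares the same read-only standard image, and its private changes are layered on top. The following sketch illustrates that concept only; all names (BaseImage, GuestVM and so on) are invented for this example and are not part of Microsoft's actual implementation.

```python
class BaseImage:
    """The read-only standard image shared by every guest VM on a host."""
    def __init__(self, blocks):
        self._blocks = dict(blocks)  # block number -> data

    def read(self, block):
        return self._blocks.get(block)


class GuestVM:
    """A guest VM: writes land in its private diff store,
    so the shared base image is never modified."""
    def __init__(self, name, base):
        self.name = name
        self._base = base
        self._diff = {}  # copy-on-write layer, private to this VM

    def write(self, block, data):
        self._diff[block] = data  # the base image stays untouched

    def read(self, block):
        # The diff store shadows the base image on reads.
        if block in self._diff:
            return self._diff[block]
        return self._base.read(block)


# One host VM can run several guest VMs over the same base image.
base = BaseImage({0: "boot", 1: "config-default"})
vm_a = GuestVM("a", base)
vm_b = GuestVM("b", base)
vm_a.write(1, "config-a")    # only vm_a sees this change
print(vm_a.read(1))          # -> config-a
print(vm_b.read(1))          # -> config-default (base image unchanged)
```

Because the base image is shared and immutable, a new guest VM can be provisioned by creating an empty diff store rather than copying the whole image, which is what makes this layout attractive in a datacenter.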
Links
- Generation 4 Datacenters
- Generation 4 Datacenters: the Architect's perspective
- Microsoft's Top 10 Best Practices for environmentally friendly datacenters
- Microsoft wins green enterprise IT award for sustainable datacenters
- What happens inside the Windows Azure Datacenter?
- Securing Microsoft's Cloud Infrastructure, May 2009, Microsoft Global Foundation Services
• • •
This article is part of the Windows Azure Series on Cloud Computing Journal. The series was originally posted on Codefest.at, the official blog of the Developer and Platform Group at Microsoft Austria, where you can see the original series.