Windows Server 2012 – New Advanced Features

In this article I would like to share the new things in Windows Server 2012 that grabbed my particular attention. It’s not a full list of the new features, which you can find on the Microsoft official site. It’s more like a summary of the more advanced and intriguing new features.

Live migrations

Windows Server 2008 R2 supported live migration, but only if the virtual hard disk's location remained the same, i.e. on a SAN. What Windows Server 2012 brings to the scene is the ability to move a virtual machine outside a cluster environment to any other Hyper-V host, and you can even move several machines at the same time. All you need is a shared folder accessible from both locations, and you can move a virtual machine's storage (storage migration) to a new location. Windows Server 2012 even offers a "Shared Nothing" live migration, meaning the ability to migrate a running virtual machine from one host to another over nothing more than a network connection, even when the hosts share no storage.

Minimum bandwidth

In a typical virtualization infrastructure, multiple virtual machines share the same physical network card. Under heavy load, one virtual machine can monopolize the traffic, leaving insufficient bandwidth for the rest of the machines. In Windows Server 2008 R2 you could set a maximum bandwidth for each virtual machine, meaning it could never use more than its allocated bandwidth, even when it needed to. However, this was inefficient in situations where the other virtual machines didn't actually need the remaining bandwidth. Setting a minimum bandwidth in Windows Server 2012 instead lets you specify how much bandwidth each virtual machine needs in order to function. These constraints are applied only when the bandwidth needs of virtual machines conflict: if there is free bandwidth, each virtual machine may use it until other virtual machines that are below their minimum bandwidth need it.
Let's say we have a 1 gigabit Ethernet card. We specify the minimum bandwidths for Virtual Machine (VM) 1, VM2, and VM3 to be 500 Mbps, 300 Mbps, and 200 Mbps respectively (the sum can't exceed the total bandwidth of the Ethernet card). In a moment of low activity from VM2 and VM3, VM1 uses 700 Mbps of the available bandwidth while VM2 and VM3 use 100 Mbps each. A moment later, however, a transaction hits VM2 and it needs all of its guaranteed bandwidth. VM2 first occupies the free 100 Mbps, but because it still needs more bandwidth and is below its 300 Mbps minimum, VM1 (which exceeds its own minimum) has to give VM2 another 100 Mbps.
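The sharing rule in the example above can be sketched as a small allocation function. This is a conceptual model only, using the made-up numbers from the example; real Hyper-V bandwidth management is configured on the virtual switch, not written in application code.

```python
def allocate(total, vms):
    """Share `total` bandwidth among VMs.

    vms maps a VM name to (minimum, demand). Each VM is first guaranteed
    up to its minimum (capped by what it actually demands); any leftover
    bandwidth is then handed out to VMs that still want more.
    """
    # Phase 1: satisfy each VM's guaranteed minimum, capped by its demand.
    alloc = {name: min(minimum, demand) for name, (minimum, demand) in vms.items()}
    free = total - sum(alloc.values())
    # Phase 2: distribute the leftover to VMs whose demand is still unmet.
    for name, (minimum, demand) in vms.items():
        extra = min(demand - alloc[name], free)
        alloc[name] += extra
        free -= extra
    return alloc

# The scenario from the text: VM2's demand jumps to its full 300 Mbps,
# so VM1 (above its 500 Mbps minimum) must surrender 100 Mbps.
vms = {"VM1": (500, 700), "VM2": (300, 300), "VM3": (200, 100)}
print(allocate(1000, vms))  # {'VM1': 600, 'VM2': 300, 'VM3': 100}
```

Note that VM1 keeps everything above its minimum only as long as no guaranteed VM is starved; the moment VM2 falls below its floor, the excess is clawed back.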

Network virtualization

Network virtualization allows you to run multiple virtual networks, possibly with overlapping IP address schemes, on top of the same physical network. It is especially useful for cloud service providers, but it can be used in businesses as well, for example when HR or Payroll traffic must be completely separated from the rest of the traffic. It also lets you place virtual machines wherever you need them, regardless of the physical network topology, even in the cloud. To make this possible, each virtual machine has two different addresses for each network adapter. One, the Customer Address (CA), is used for communication with the rest of the virtual machines and hosts in the same virtual network. The other, the Provider Address (PA), is used for communication on the physical network only. Because each client or department is assigned its own address mappings, the provider always knows which traffic comes from which client, and each tenant's traffic stays completely isolated from all other traffic on the physical network.
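The core of the CA/PA scheme is a per-tenant mapping table, which is why two tenants can reuse the exact same IP plan without colliding. A minimal sketch of that lookup (all tenant names and addresses below are hypothetical):

```python
# Each (tenant, customer address) pair maps to a unique provider address
# on the physical network. Two tenants may reuse the same CA safely,
# because the tenant name is part of the key.
pa_table = {
    ("TenantA", "10.0.0.5"): "192.168.1.10",
    ("TenantB", "10.0.0.5"): "192.168.1.22",  # same CA, different tenant
}

def provider_address(tenant, customer_address):
    """Resolve the physical-network address for a tenant's virtual address."""
    return pa_table[(tenant, customer_address)]

print(provider_address("TenantA", "10.0.0.5"))  # 192.168.1.10
print(provider_address("TenantB", "10.0.0.5"))  # 192.168.1.22
```

Because the physical network only ever sees provider addresses, neither tenant can observe or interfere with the other's traffic even though their virtual address plans are identical.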

Resource metering

Resource metering simplifies capacity planning by collecting statistics on a virtual machine's resource usage over a period of time. Furthermore, Windows Server 2012 introduces the concept of resource pools, which group multiple virtual machines belonging to one specific client or serving one specific function, so that resource metrics are collected on a per-client or per-function basis. This technique is helpful for IT budgeting and for billing customers. The metrics typically collected are: average CPU use (over a selected period of time); average, minimum, and maximum memory use; maximum disk allocation; and incoming and outgoing network traffic.
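Aggregating per-VM samples into per-pool figures is the essence of billing by resource pool. The sketch below models that roll-up with invented sample data; real metering data comes from Hyper-V itself, not from code like this.

```python
from statistics import mean

# Hypothetical metering samples per VM: CPU (MHz) and memory (MB) readings
# over time, plus total incoming network traffic (MB).
samples = {
    "VM1": {"cpu": [400, 600], "mem": [1024, 2048], "net_in": 300},
    "VM2": {"cpu": [200, 200], "mem": [512, 512], "net_in": 120},
}
# A resource pool groups the VMs that belong to one client.
pools = {"ClientA": ["VM1", "VM2"]}

def pool_report(pool):
    """Roll individual VM samples up into pool-level billing metrics."""
    vms = [samples[name] for name in pools[pool]]
    return {
        "avg_cpu": mean(x for vm in vms for x in vm["cpu"]),
        "max_mem": max(x for vm in vms for x in vm["mem"]),
        "min_mem": min(x for vm in vms for x in vm["mem"]),
        "net_in": sum(vm["net_in"] for vm in vms),
    }

print(pool_report("ClientA"))
```

A report like this, produced per pool rather than per machine, is what lets an IT department charge back costs by client or function.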

Dynamic Host Configuration Protocol (DHCP)

A rogue DHCP server is a fake server connected to the network that collects DHCP requests and responds with incorrect addressing information. Active Directory protects the DHCP service by not allowing Windows DHCP servers to operate on the network until they are authorized. However, this does not apply to non-Microsoft-Windows DHCP servers, which can still connect to the network and hand out addresses. Windows Server 2012 limits this risk by allowing you to specify which ports can have DHCP servers attached. If an intruder attaches to any other port, its fake DHCP server packets are simply dropped.
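The port-based protection described above reduces to a trust check on the ingress port of any packet claiming to come from a DHCP server. A minimal sketch of that drop rule (the port numbers are hypothetical):

```python
# Ports trusted to carry DHCP server traffic (hypothetical values).
ALLOWED_DHCP_PORTS = {1, 2}

def accept_dhcp_offer(ingress_port):
    """Keep DHCP server packets only if they arrive on a trusted port.

    Anything arriving on an untrusted port is treated as a rogue
    server's traffic and dropped.
    """
    return ingress_port in ALLOWED_DHCP_PORTS

print(accept_dhcp_offer(1))  # True: offer from a trusted port is delivered
print(accept_dhcp_offer(7))  # False: offer from an untrusted port is dropped
```

The key property is that legitimacy is decided by where the packet entered the network, not by anything the rogue server says about itself.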

Snapshots

Snapshots are mainly used when you need point-in-time recovery in case of an error. For example, when you apply a service pack on a production server, you may want to give yourself a fallback in case something bad happens: you take the snapshot before the service pack installation and, if needed, recover the server from it. But what happens if you don't need it? You've monitored the server for a while and everything seems normal after the patch, so you no longer need the snapshot. In Windows Server 2008 R2 you couldn't just get rid of it; you would have to take the virtual machine offline for a while, making it inaccessible. Windows Server 2012 has a new feature, called Hyper-V Live Merge, which allows you to release the snapshot while the machine continues to run.
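Conceptually, a snapshot freezes the base disk and redirects later writes into a differencing layer; releasing the snapshot means folding that layer back into the base. The toy model below illustrates the merge with disks represented as block maps; it is only a mental model, not the actual VHD/AVHD file format.

```python
# Model a virtual disk as a mapping of block number -> data. Taking a
# snapshot freezes `base`; every write after that lands in `diff`
# (the differencing layer, an AVHD file in Hyper-V terms).
base = {0: "old0", 1: "old1"}
diff = {1: "new1", 2: "new2"}  # blocks written after the snapshot

def live_merge(base, diff):
    """Fold the differencing layer back into the base disk.

    Post-snapshot data wins wherever both layers hold the same block,
    leaving a single disk identical to what the running VM already sees.
    """
    merged = dict(base)
    merged.update(diff)
    return merged

print(live_merge(base, diff))  # {0: 'old0', 1: 'new1', 2: 'new2'}
```

Because the merged result is byte-for-byte what the VM was already reading through the two layers, the merge can proceed in the background while the machine keeps running.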

Stay tuned to Monitis for our future articles on Windows Server 2012. We will take a deeper look into these and some more advanced new features.



More Stories By Hovhannes Avoyan

Hovhannes Avoyan is the CEO of PicsArt, Inc.,
