Understanding the Impact of Your Workload on Your Cloud Infrastructure

Deploying dynamic and scalable websites

Enterprises are quickly realizing that their future success depends on their ability to adapt their business to the Cloud. That realization, however, comes with more questions and concerns about executing an effective cloud-based strategy. The explosion of the OpenStack community has made it possible for hosting providers and businesses to create or utilize Amazon-like public and private clouds, but it's clear that the Cloud is not a one-size-fits-all solution. One prime factor that dictates the success of a cloud computing strategy is the particular workload an enterprise is tackling. From DevOps to rapidly deploying dynamic and scalable websites, enterprises' workload needs should dictate their cloud architecture.

Specific workloads affect many elements of the cloud, particularly the architecture of the underlying infrastructure. Examining concrete workload use cases makes it clear how integral infrastructure architecture is to meeting workload requirements.

The first element to consider in the architecture of cloud infrastructure is computing power. The number and speed of compute nodes within a cloud configuration dictate how quickly processes can be executed. This comes into play prominently when assessing a workload: the computing power required to develop a web app pales in comparison to the compute power required to execute Big Data analysis. While large-scale data analysis is completely within the purview of a well-constructed cloud, the architecture must be designed with that workload in mind.
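
To make the contrast concrete, here is a rough, back-of-the-envelope sizing sketch in Python. All of the core counts and overcommit ratios are hypothetical placeholders, not recommendations; the point is simply that the same sizing exercise produces very different answers for different workloads.

```python
# Rough, illustrative sizing sketch: how many compute nodes would two very
# different workloads need? All figures below are hypothetical placeholders.
import math

def nodes_required(total_vcpus_needed: int, vcpus_per_node: int,
                   overcommit_ratio: float = 1.0) -> int:
    """Return the number of compute nodes needed for a given vCPU demand."""
    effective_vcpus_per_node = vcpus_per_node * overcommit_ratio
    return math.ceil(total_vcpus_needed / effective_vcpus_per_node)

# A small web-app dev environment: a handful of modest VMs, heavy overcommit.
print("Web app dev:", nodes_required(total_vcpus_needed=16,
                                     vcpus_per_node=32,
                                     overcommit_ratio=4.0), "node(s)")

# A Big Data analysis cluster: many large, CPU-hungry workers, no overcommit.
print("Big Data analysis:", nodes_required(total_vcpus_needed=512,
                                           vcpus_per_node=32,
                                           overcommit_ratio=1.0), "node(s)")
```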

The next integral ingredient in a cloud's architecture is storage. There are several different types of storage, and they vary in availability, resiliency and transactional performance. Amazon's Simple Storage Service (S3) provides a multi-tenant object storage environment, while block storage, like Amazon EBS, provides a persistent storage target. Typically an enterprise architecture would require a multi-tier SAN that provides enough IOPS (input/output operations per second) both for the storage backing the VMs and for transactional block storage. As flash storage has matured, it has become possible to collapse that typical storage architecture and run virtual machine operating systems and persistent transactional data on the same tier.
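
As a rough illustration of how those two storage types are provisioned programmatically, the sketch below uses AWS's boto3 library to create an S3 bucket and a provisioned-IOPS EBS volume. The bucket name, region, availability zone, size and IOPS figures are hypothetical examples only.

```python
# Minimal sketch, assuming AWS credentials are already configured locally.
# Bucket name, region, AZ, size and IOPS below are illustrative placeholders.
import boto3

REGION = "us-west-2"

# Object storage: an S3 bucket for durable, multi-tenant object data.
s3 = boto3.client("s3", region_name=REGION)
s3.create_bucket(
    Bucket="example-workload-artifacts",          # hypothetical bucket name
    CreateBucketConfiguration={"LocationConstraint": REGION},
)

# Block storage: a provisioned-IOPS EBS volume for transactional workloads.
ec2 = boto3.client("ec2", region_name=REGION)
volume = ec2.create_volume(
    AvailabilityZone="us-west-2a",
    Size=200,                # GiB
    VolumeType="io1",        # provisioned-IOPS volume type
    Iops=4000,               # IOPS figure chosen for illustration only
)
print("Created EBS volume:", volume["VolumeId"])
```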

Another variable worth pointing out is data access speed. A cloud may offer a large amount of storage capacity, but how quickly the data stored within it can be accessed is its own factor in designing infrastructure for a particular workload.
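
A quick way to see that access pattern matters as much as raw capacity is to time sequential versus random reads over the same data. The sketch below is a toy measurement only; the file and block sizes are arbitrary, and on a small file the operating system's page cache will mask much of the difference you would see against real storage.

```python
# Tiny, self-contained sketch contrasting sequential and random access to the
# same data. File size and block size are arbitrary illustrative values.
import os, random, tempfile, time

BLOCK = 4096                      # 4 KiB blocks
BLOCKS = 25_000                   # ~100 MiB test file

with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(os.urandom(BLOCK * BLOCKS))
    path = f.name

def timed_read(offsets) -> float:
    start = time.perf_counter()
    with open(path, "rb") as fh:
        for off in offsets:
            fh.seek(off)
            fh.read(BLOCK)
    return time.perf_counter() - start

sequential = [i * BLOCK for i in range(BLOCKS)]
random_order = sequential[:]
random.shuffle(random_order)

print(f"sequential read: {timed_read(sequential):.2f}s")
print(f"random read:     {timed_read(random_order):.2f}s")
os.remove(path)
```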

The last vector for consideration is density. In many datacenters, space is readily available, but that is not always the case. Compact and energy-efficient hardware takes up less room in a datacenter, presumably saving cost. However, dense hardware tends to be more expensive, making the decision a trade-off between the cost per square foot and the cost of the denser hardware. One must also consider power density per square foot, which varies widely from one data center to another. Less dense solutions also tend to be less power-efficient, bringing an additional cost into the analysis. This determination has to be made case by case, based on each facility's costs and constraints.
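
A simple back-of-the-envelope comparison can frame that trade-off. Every figure in the sketch below is a hypothetical placeholder; substitute your own hardware quotes, rack-space costs and power rates.

```python
# Back-of-the-envelope comparison of dense vs. standard hardware.
# All numbers here are hypothetical placeholders, not vendor pricing.

def annual_cost(servers: int, price_per_server: float, racks: int,
                cost_per_rack_year: float, kw_per_server: float,
                power_cost_per_kwh: float, years: int = 3) -> float:
    hardware = servers * price_per_server / years           # amortized per year
    space = racks * cost_per_rack_year                      # per year
    power = servers * kw_per_server * 24 * 365 * power_cost_per_kwh
    return hardware + space + power

dense = annual_cost(servers=40, price_per_server=12_000, racks=2,
                    cost_per_rack_year=6_000, kw_per_server=0.35,
                    power_cost_per_kwh=0.12)
standard = annual_cost(servers=40, price_per_server=8_000, racks=5,
                       cost_per_rack_year=6_000, kw_per_server=0.50,
                       power_cost_per_kwh=0.12)

print(f"dense:    ${dense:,.0f}/year")
print(f"standard: ${standard:,.0f}/year")
```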

Dissecting DevOps
DevOps is a term that has gained a great deal of attention in recent years, as enterprises acknowledge the interdependence of IT operations and software development teams. DevOps aficionados are looking to cloud technology as a means to more closely align the two groups' respective goals, which have traditionally been at odds.

In practice, DevOps means creating a cloud environment that allows developers to quickly self-service launch the build and test virtual machines required to create the artifacts used in a continuous delivery pipeline. This kind of pipeline requires that the main code base (often referred to as the trunk or mainline) stay constantly in a "green" state, building and executing with no fatal errors. One of the fundamental keys to creating that pipeline is rapidly rebuilding and unit testing any changed code. Some development shops rebuild on every code check-in by every developer, while others take a less extreme approach and build every ten minutes or on the hour.
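
As a minimal illustration of that rebuild-on-change loop, the sketch below polls a mainline checkout and rebuilds and unit-tests whenever a new commit appears. The repository path, poll interval and build commands are hypothetical; most shops would use a CI server triggered by check-in hooks rather than a hand-rolled loop like this.

```python
# Minimal trunk-watching build loop: poll the mainline, and when a new commit
# lands, rebuild and run the unit tests. Paths and make targets are examples.
import subprocess, time

REPO = "/srv/repos/mainline"          # hypothetical checkout of the trunk
POLL_SECONDS = 600                    # the "build every ten minutes" variant

def head_commit() -> str:
    return subprocess.check_output(
        ["git", "-C", REPO, "rev-parse", "HEAD"], text=True).strip()

def build_and_test() -> bool:
    build = subprocess.run(["make", "-C", REPO, "build"])
    if build.returncode != 0:
        return False
    tests = subprocess.run(["make", "-C", REPO, "test"])
    return tests.returncode == 0

last_seen = None
while True:
    subprocess.run(["git", "-C", REPO, "pull", "--ff-only"], check=False)
    current = head_commit()
    if current != last_seen:
        state = "green" if build_and_test() else "red"
        print(f"commit {current[:8]} -> pipeline is {state}")
        last_seen = current
    time.sleep(POLL_SECONDS)
```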

The success of a continuous build environment depends largely on a fast, well-orchestrated infrastructure. For this use case, an IT manager will seek out a cloud architecture that launches and kills virtual machines (VMs) quickly and includes highly accessible storage. Depending on the specifics, this could mean a cloud that delivers a large number of IOPS so that VMs launch quickly and the build workload completes fast.
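
Because this kind of cloud is often built on OpenStack, a minimal sketch of the launch-and-tear-down cycle using the openstacksdk library might look like the following; the cloud, image, flavor and network names are hypothetical placeholders.

```python
# Minimal sketch: launch a short-lived build worker, then delete it.
# The cloud name, image, flavor and network below are hypothetical.
import openstack

conn = openstack.connect(cloud="build-cloud")   # entry in clouds.yaml

image = conn.compute.find_image("ubuntu-build-image")
flavor = conn.compute.find_flavor("m1.large")
network = conn.network.find_network("build-net")

server = conn.compute.create_server(
    name="build-worker-01",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print("build worker active:", server.name)

# ... run the build and test job against the worker here ...

conn.compute.delete_server(server)              # kill the VM when done
```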

Deploying Dynamic and Scalable Websites
There's arguably no greater beneficiary of cloud computing than a company that repeatedly launches similar websites. Take, for example, a media company that delivers entertainment content across its platforms. Critical to this company's success is delivering existing and new content through rapidly changing websites, powered by innovative applications that provide interactive experiences, engage users and build a loyal audience. For this type of workload, developers require automated provisioning and flexible storage and compute options, because different launches impose different demands: a user-generated content (UGC) contest may demand more storage, while an MMORPG tie-in may require a compute-intensive environment.

These requirements often vary in scope but are consistent in frequency, so it is vital to eliminate repetitive, time-consuming tasks such as installing and configuring commonly used website software like databases and web servers. Well-made templates can be reused, and when consistency is maintained automatically, system administrators can focus on higher-value tasks rather than performing repairs. Where other workloads may have a narrow scope, deploying dynamic and scalable websites effectively and efficiently requires elasticity and flexibility across compute, storage and data access.
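
A minimal sketch of that template-driven approach is shown below: one reusable site template, many consistent launches. The template fields and override values are hypothetical, and in practice this role is usually filled by tools such as OpenStack Heat, AWS CloudFormation or a configuration-management system.

```python
# One reusable site template, many consistent launches.
# Template fields and launch parameters are hypothetical placeholders.
import copy

SITE_TEMPLATE = {
    "web_servers": 2,
    "web_flavor": "m1.medium",
    "database": {"engine": "mysql", "flavor": "m1.large", "storage_gb": 100},
    "packages": ["nginx", "php-fpm", "mysql-client"],
}

def render_site(name: str, **overrides) -> dict:
    """Copy the template and apply per-launch overrides."""
    site = copy.deepcopy(SITE_TEMPLATE)
    site["name"] = name
    site.update(overrides)
    return site

# A UGC contest needs more storage; a game tie-in site needs more compute.
contest = render_site("summer-ugc-contest",
                      database={"engine": "mysql", "flavor": "m1.large",
                                "storage_gb": 500})
game_site = render_site("mmorpg-launch", web_servers=8, web_flavor="c1.xlarge")

for site in (contest, game_site):
    print(site["name"], "->", site["web_servers"], "web servers,",
          site["database"]["storage_gb"], "GB database storage")
```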

Approaching High Performance Computing Animation
One interesting HPC application of cloud technologies is animation rendering. Over the years the animation industry has used various computer hardware and software technologies to automate steps in the production process. Because many of these steps require high-performance computing systems with significant CPU and IOPS capabilities, animation shops have often relied on purpose-built hardware and software systems for their peak capacity. With the advent of server virtualization, high-speed solid state drives (SSDs) and standards-based cloud platforms, animators are taking a closer look at the benefits of cloud technology. For these workloads to be efficient and effective in the cloud, high-powered computing must be coupled with high IOPS, because virtual machines are launched and deprovisioned rapidly to handle short-lived but CPU-intensive tasks.
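
The sketch below is only a toy stand-in for that pattern: each worker process models a short-lived, CPU-intensive render node that exists just long enough to process its share of frames. A real render farm would launch and deprovision actual VMs (or bare-metal nodes) rather than local processes.

```python
# Toy stand-in for a render farm: each worker process models a short-lived,
# CPU-intensive render node that exists only for the duration of the batch.
from concurrent.futures import ProcessPoolExecutor

def render_frame(frame_id: int) -> str:
    # Placeholder for a CPU-heavy render step.
    checksum = sum(i * i for i in range(500_000)) % 9973
    return f"frame {frame_id:04d} rendered (checksum {checksum})"

if __name__ == "__main__":
    frames = range(1, 25)
    with ProcessPoolExecutor(max_workers=8) as pool:   # 8 ephemeral "nodes"
        for line in pool.map(render_frame, frames):
            print(line)
```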

Designing an infrastructure around a particular workload requires a comprehensive understanding of that workload's basic functions. While optimizing an infrastructure for a particular workload can present some up-front hurdles, the long-run efficiency and potential cost savings are significant, as managers can focus resources on the elements of their architecture with the greatest impact.

More Stories By Christopher Aedo

Christopher Aedo is senior director of technical operations at Morphlabs where he oversees the technology and operations side. He found his niche early in his career while helping a global accounting firm move their information systems from an IBM mainframe to a distributed network of Novell and SCO Unix servers. He is currently focused on making it easy for technology groups to move their infrastructure and applications from bare-metal or virtualized servers into public and private clouds.
