Benefits of Load Testing in the Cloud (Part 1)

How to choose the right approach

Many companies have moved applications to the cloud as a way to reduce capital expenditure while improving IT focus and effectiveness. End users see the cloud as a way to access their documents and applications remotely from anywhere and from any device. IT managers see the cloud as a means of rapidly adapting their infrastructures as needed via virtualization and a pay-per-use model. But what about load testing engineers? Can they seize the opportunities afforded by the cloud to better test the performance of web applications?

As with past overhyped trends in IT, it is important to see past all the talk and look for concrete ways to take advantage of this new technology's flexibility and scalability to save time, reduce costs, and improve the way your organization works.

This article describes how the cloud is revolutionizing load testing and the advantages it provides in many situations for ensuring your web applications perform well in production. It also covers key capabilities to look for in a load testing solution. Without the right tools in place, simply moving your testing activities to the cloud will likely not deliver the results necessary to justify the move. Understanding how to apply the right tools and practices to make the most of the cloud is fundamental to cloud-based testing and vital to ultimately going live with total peace of mind.

Benefits of Load Testing in the Cloud
Load testing with the cloud enables testing teams to take a big step forward by conducting more efficient and more realistic large-scale tests, while realizing significant savings in cost and time.

Perform Large-Scale Tests
More and more, today's web applications are experiencing sporadic surges in traffic. These traffic spikes can have many causes, including a new advertising campaign, an online article, a seasonal sale, and buzz on Twitter or other social media. If your application is unable to handle the increased load, you run the risk of lost business opportunities and potential damage to your brand.

Generating the load for large-scale tests to mimic these unanticipated spikes in production traffic, however, typically requires tens or even hundreds of machines. Purchasing and configuring these systems requires a significant investment of time and money. Once acquired and used for the immediate load testing need, the machines may sit unused for long stretches until they are needed for the next large-scale load testing project. With the cloud, you can rapidly set up as many load-generating machines as you need, on demand.
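To make this concrete, here is a minimal sketch of what on-demand provisioning can look like when scripted directly against a cloud API, in this case AWS via the boto3 library. The AMI ID, instance type, instance count, and tags are illustrative placeholders; an integrated load testing solution would normally hide this step behind its own interface.

```python
# Illustrative sketch: spinning up a pool of load-generator instances on demand
# using boto3 (AWS). The AMI ID, instance type, and tag values are hypothetical
# placeholders; a load testing tool with cloud integration automates this step.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

generators = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",  # hypothetical pre-built load-generator image
    InstanceType="c5.large",
    MinCount=10,                      # ten generators for this test run
    MaxCount=10,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "role", "Value": "load-generator"}],
    }],
)

for instance in generators:
    instance.wait_until_running()     # block until each generator is ready

print(f"{len(generators)} load generators running")
```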

Perform More Realistic Tests
When testing a web application using machines inside your firewall, you're not testing the entire delivery chain. Unless all of your end users will also be within your firewall, such tests are inherently limited and may fail to reveal all performance issues.

With the cloud, you can execute load tests that access your web application as your users will - from outside of your firewall - and validate all components of the delivery chain, including the firewall, DNS, network equipment, and ISP. These tests are more realistic, and they enable you to evaluate the real-world effects of third-party components, such as content delivery networks, analytics servers, and ad servers.

Your users won't all be accessing your app from the same fixed location across the same network, so a realistic load test cannot be completed from a single location. That's why it's important to test your application and its components from different locations and geographic regions and assess its performance as network bandwidth and latency change.

Save Time and Reduce Costs with Pay-as-you-go
When load testing with the cloud, there is no need to spend weeks setting up and configuring dozens of real machines. You can create and configure the machine image you need once and then replicate it in the cloud as many times as needed. Often, the cloud testing provider will automate this process as well, saving you even more time.

Further, the substantial up-front costs of purchasing and maintaining machines that may be used only infrequently are eliminated with the cloud. Using the pay-as-you-go model, you can rapidly set up the testing infrastructure you need, when you need it, and only for as long as you need it. From a business standpoint, the cloud lowers total cost of ownership, while increasing flexibility.

How to Choose a Cloud Testing Solution
While all cloud load testing solutions will enable you to make use of the cloud in some way, comparatively few enable you to follow all of the best practices outlined here and capitalize on the opportunities that load testing with the cloud offers. A highway lets you travel faster than a side street, but the vehicle you use makes a big difference in how quickly and how reliably you arrive at your destination. In much the same way, load testing with the cloud offers clear advantages over traditional load testing, but the tools you use are even more important to the quality of your tests.

When considering a cloud testing solution, ask the following questions:

  1. To what extent does the solution integrate with the cloud?
  2. Will the solution enable us to conduct realistic tests?
  3. Does the solution support unified tests inside and outside the firewall?
  4. Is the solution easy to use, or will we spend weeks learning and configuring it?
  5. Does the solution include full-featured reporting and decision-making modules to help our team make the most of the results?
  6. Does the solution support the technologies we used to build the application?

Integration with the Cloud Platform
If you opt for a solution that is not integrated with one or more cloud platforms, you'll need to handle several time-consuming tasks on your own. First, you'll need to learn how each platform you'll be using works, including its limitations and constraints. Second, you'll need to build, test, and maintain your own virtual machine images.

Load testing solutions that offer integration with the cloud simplify and accelerate the steps needed to use the cloud infrastructure. These solutions offer one or more of the following advantages over non-integrated alternatives:

  • Fast provisioning using preconfigured images. You can set up the infrastructure you need in minutes.
  • Simplified security. All required protections are set up by default, including firewall, certificates, and encryption.
  • Improved scalability. Leading load testing solution providers have negotiated with cloud providers to allow users of their software to employ more virtual machines (for the purpose of load testing) than are allowed by default.
  • A unified interface for multiple cloud providers. Load testing solutions can hide provisioning and billing details, so you can take maximum advantage of the cloud in a minimum of time.
  • Advanced test launching. You can save time and effort by defining and launching load generators in the cloud directly from the load testing interface.
  • Advanced results reporting. Distinct results from each geographic region involved in the test are available for analysis.

Of course, few solutions include every one of these integration capabilities. Most solutions fall somewhere on the spectrum between little or no integration and full-featured integration with multiple cloud platforms.

Realistic Tests
Although testing from the cloud is, in many cases, more realistic than testing in the lab, simply moving to the cloud is not enough to ensure the most realistic tests. Real users often have access to less bandwidth than a load generator in a cloud data center. With a slower connection, the real user will have to wait longer than the load generator to download all the data needed for a web page or application. This has two major implications:

  • Response times measured from the cloud with virtually unlimited bandwidth are better than those real users will experience. This can lead test engineers to draw the wrong conclusion that users will see acceptable response times when in reality they will not.
  • With real users, the total number of simultaneous connections to the server will be higher, because on average their connections stay open longer than the load generator's. This can lead to a situation in which the server unexpectedly refuses additional connections under load.

When choosing a load testing solution, look for one that provides a bandwidth simulation feature that limits bandwidth to ensure that the virtual users download the content of the web application at a realistic rate. This capability is particularly important when testing mobile applications, because mobile devices typically operate with less bandwidth than laptops and desktops.
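As an illustration, the following sketch shows one way bandwidth simulation can be approximated in a hand-rolled script: the virtual user reads the response body no faster than a target rate. The URL and the 1 Mbit/s limit are hypothetical; commercial tools apply this kind of throttling per virtual user automatically.

```python
# Minimal sketch of bandwidth simulation: a virtual user reads the response body
# at a capped rate (here ~1 Mbit/s) instead of as fast as the data center link allows.
# The URL and rate are illustrative only.
import time
import requests

def throttled_get(url, bits_per_second=1_000_000, chunk_size=8192):
    start = time.monotonic()
    received_bits = 0
    response = requests.get(url, stream=True, timeout=30)
    for chunk in response.iter_content(chunk_size=chunk_size):
        received_bits += len(chunk) * 8
        # Sleep just long enough that the average rate never exceeds the simulated bandwidth.
        expected_elapsed = received_bits / bits_per_second
        actual_elapsed = time.monotonic() - start
        if expected_elapsed > actual_elapsed:
            time.sleep(expected_elapsed - actual_elapsed)
    return time.monotonic() - start  # realistic "page download" time for this user

print(f"Download took {throttled_get('https://example.com/'):.2f}s at ~1 Mbit/s")
```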

Similarly, look for a solution that can parallelize requests. Modern browsers have the ability to parallelize HTTP requests as they retrieve a web page's static resources. These parallel requests require more connections with the server and can lengthen response times. Load testing solutions that do not parallelize requests are incapable of producing truly realistic performance tests for web applications.
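The sketch below illustrates request parallelization: fetching a page's static resources over several concurrent connections, much as a browser does, rather than one after another. The resource URLs and the connection count are hypothetical; in a real test, the resources come from the recorded page.

```python
# Illustrative sketch: fetching a page's static resources in parallel, the way a
# browser opens multiple connections, rather than serially. URLs are hypothetical.
from concurrent.futures import ThreadPoolExecutor
import time
import requests

STATIC_RESOURCES = [
    "https://example.com/css/site.css",
    "https://example.com/js/app.js",
    "https://example.com/img/logo.png",
    "https://example.com/img/banner.jpg",
]

def fetch(url):
    return url, requests.get(url, timeout=10).status_code

start = time.monotonic()
with ThreadPoolExecutor(max_workers=6) as pool:  # ~6 parallel connections, like a browser
    for url, status in pool.map(fetch, STATIC_RESOURCES):
        print(status, url)
print(f"Parallel fetch finished in {time.monotonic() - start:.2f}s")
```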

Unified Lab Testing and Cloud Testing
Organizations that use only lab testing or only cloud testing are at a disadvantage. So are companies that use different tools for these activities.

A solution that supports lab testing enables test engineers to begin verifying the performance of an application internally, before it's ready to be made available via the Internet. This makes it possible to find and fix performance problems earlier in the application lifecycle. Such a solution also lowers cloud costs by enabling teams to conduct internal performance tests on existing hardware when available.

More important, a single solution that supports lab testing and cloud testing enables test engineers to reuse scripts for both kinds of tests, which saves time and effort. Reusing scripts also helps pinpoint performance problems that show up in cloud testing but not in internal tests. Last, a unified solution lowers licensing and training costs, and enables test engineers to use their existing skill set for both types of load testing.

Ease of Use
Testing, with its natural position toward the end of the application lifecycle, is almost always performed under tight time constraints. Delays in the requirements or implementation phases of a project usually result in less time for the test engineers to do their jobs. The pressure is on to deliver results as quickly as possible. This environment is no place for a tool that is difficult to use and configure.

In developing and executing performance tests (either internally or via the cloud), several key features go a long way toward improving test engineer productivity, including support for:

  • Easily launching the recording of a virtual user profile (preferably in one click).
  • Defining advanced behaviors (with structures such as conditions and loops) via a graphical interface, complemented by the ability to use a scripting language (JavaScript, for example) for more complex cases.
  • Automatic handling of dynamic parameters. This includes a set of correlation rules for well-known server frameworks. Ideally, the solution will dynamically detect and handle custom parameters specific to your application (a minimal sketch of manual correlation follows at the end of this section).
  • Sharing common script parts, such as login or logout transactions, between multiple virtual user profiles.
  • Comparing results. Sifting through results to determine the effect of a particular application or infrastructure change can be a time-consuming and arduous task without a dedicated comparison tool.

This is not an exhaustive list of usability features that can help test engineers work more efficiently; rather, it should be considered a baseline of minimum required capabilities for an efficient load testing solution.
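As promised above, here is a minimal sketch of what correlation of dynamic parameters amounts to when done by hand: capture a dynamic value (here a hypothetical CSRF token) from one response and replay it in the next request. The field name, URLs, and credentials are illustrative; a solution with automatic correlation performs this detection and substitution for you.

```python
# Minimal sketch of "correlation": extract a dynamic value from one response and
# reuse it in the next request. The field name "csrf_token", URLs, and credentials
# are hypothetical; a good load testing tool handles such parameters automatically.
import re
import requests

session = requests.Session()

login_page = session.get("https://example.com/login", timeout=10)
match = re.search(r'name="csrf_token"\s+value="([^"]+)"', login_page.text)
token = match.group(1) if match else ""

resp = session.post(
    "https://example.com/login",
    data={"user": "vu_001", "password": "secret", "csrf_token": token},
    timeout=10,
)
print("Login status:", resp.status_code)
```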

Analysis, Monitoring, Scheduling, and Reporting
Recording a virtual user profile and playing it back to get raw results is only the beginning of an effective performance test. You need tools to help you analyze the results (in real time when possible), find the root cause of problems, and produce actionable results.

Real-time analysis enables you to detect and understand issues while the test is running. You don't have to wait for the test to finish before detecting an issue, correcting it, and restarting the test. When testing in production, real-time analysis also enables you to abort a test if it threatens to affect the performance experienced by real users.

A comprehensive monitoring system is essential when you need to find the root cause of a problem. Predefined performance counters and threshold alerts based on industry best practices make it easy to set up monitoring and interpret what the counters are telling you. For a nonintrusive solution that is easier to set up, look for a tool that supports agentless remote monitoring.
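For illustration, the sketch below shows what a threshold alert reduces to: compare sampled counters against predefined limits and flag anything out of bounds. The counter names, sample values, and thresholds are all hypothetical; a full monitoring module ships such rules predefined and evaluates them continuously during the test.

```python
# Minimal sketch of threshold-based alerting on monitored counters. The counter
# names, sample values, and thresholds are hypothetical placeholders.
THRESHOLDS = {
    "cpu_percent": 85.0,          # alert if CPU stays above 85%
    "avg_response_time_s": 2.0,   # alert if average response time exceeds 2 seconds
    "error_rate_percent": 1.0,    # alert if more than 1% of requests fail
}

def check_sample(sample: dict) -> list:
    """Return an alert message for every counter above its threshold."""
    return [
        f"ALERT: {name}={value} exceeds threshold {THRESHOLDS[name]}"
        for name, value in sample.items()
        if name in THRESHOLDS and value > THRESHOLDS[name]
    ]

# Example sample collected mid-test (values are illustrative)
for alert in check_sample({"cpu_percent": 91.2, "avg_response_time_s": 1.4, "error_rate_percent": 0.3}):
    print(alert)
```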

If your organization performs regular regression tests - and even if it doesn't - you may want to schedule performance tests and execute them automatically via the command line to complement functional testing. Regularly scheduled load tests with automatically generated reports can help organizations detect performance regression as soon as it starts to occur, which makes it easier to pinpoint and correct.
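The following sketch shows one way such unattended execution might be wired up, assuming a hypothetical command-line runner named loadtest; the actual command, flags, and report path depend entirely on the tool you choose, and a scheduler such as cron or a CI server would invoke this script nightly.

```python
# Illustrative sketch of unattended, scheduled test execution. The "loadtest" CLI,
# its flags, and the report path are hypothetical stand-ins for whatever
# command-line runner your load testing tool provides.
import subprocess
import sys

result = subprocess.run(
    ["loadtest", "run", "nightly_regression.scenario",
     "--report", "reports/nightly.html"],
    capture_output=True,
    text=True,
)

print(result.stdout)
if result.returncode != 0:
    # A non-zero exit code signals a failed SLA or regression to the CI pipeline.
    sys.exit(result.returncode)
```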

Last, reporting is a key capability and essential for communicating test results to others on the team, including management. Because reporting needs change, it is a good idea to keep your options open with a tool that supports multiple formats, including PDF, Word, HTML, and XML for integration with other systems.

Support for Web Technologies
To test Siebel applications or applications built with Adobe Flex, Microsoft Silverlight, Real-Time Messaging Protocol (RTMP), Oracle Forms, or AJAX push technologies, you need a load testing tool with built-in support for the technologies you're using. Without this specialized support, it can be very difficult, if not impossible, to effectively test the performance of your applications.

Similarly, the load testing solution you choose should provide support for the authentication mechanism employed by your applications, whether it is Basic, Digest, NTLM, or Kerberos. Otherwise, you won't be able to set up a virtual user profile that tests the application as a real person would use it.
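As a small illustration, the snippet below issues the same scripted request with Basic and then Digest authentication using the Python requests library; NTLM and Kerberos typically require additional packages (such as requests-ntlm or requests-kerberos) and are omitted. The URL and credentials are hypothetical.

```python
# Minimal sketch: the same scripted request with Basic and Digest authentication,
# using the requests library. The URL and credentials are hypothetical.
import requests
from requests.auth import HTTPBasicAuth, HTTPDigestAuth

basic = requests.get("https://example.com/secure",
                     auth=HTTPBasicAuth("vu_001", "secret"), timeout=10)
digest = requests.get("https://example.com/secure",
                      auth=HTTPDigestAuth("vu_001", "secret"), timeout=10)

print("Basic:", basic.status_code, "Digest:", digest.status_code)
```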

Summing It Up
The cloud is opening new opportunities to improve the scale and realism of load testing while saving time and lowering costs. When selecting a cloud testing solution, keep in mind that the primary factor in your success will not be simply the move to the cloud, but rather the tool you use and how well it uses cloud technology.

More Stories By Steve Weisfeldt

Steve Weisfeldt is a Senior Performance Engineer at Neotys, a provider of load testing software for Web applications. Previously, he worked as President of Engine 1 Consulting, a services firm specializing in all facets of test automation. Before Engine 1 Consulting, he was a Senior Systems Engineer at Aternity, and prior to that he spent seven years at automated testing vendor Segue Software (acquired by Borland). While spending most of his time at Segue delivering professional services and training, he was also involved in pre-sales and product marketing efforts.

Having worked in the load and performance testing space since 1999, Steve has been involved in load and performance testing projects of all sizes, in industries spanning the retail, financial services, insurance, and manufacturing sectors. His expertise lies in enabling organizations to optimize their ability to develop, test, and launch high-quality applications efficiently, on time, and on budget. Steve graduated from the University of Massachusetts-Lowell with a BS in Electrical Engineering and an MS in Computer Engineering.
