Benefits of Load Testing in the Cloud (Part 1)

How to choose the right approach

Many companies have moved applications to the cloud as a way to reduce capital expenditure while improving IT focus and effectiveness. End users see the cloud as a way to access their documents and applications remotely from anywhere and from any device. IT managers see the cloud as a means of rapidly adapting their infrastructures as needed via virtualization and a pay-per-use model. But what about load testing engineers? Can they seize the opportunities afforded by the cloud to better test the performance of web applications?

As with past overhyped trends in IT, it is important to see past all the talk and look for concrete ways to take advantage of this new technology's flexibility and scalability to save time, reduce costs, and improve the way your organization works.

This article describes how the cloud is revolutionizing load testing and the advantages it provides in many situations for ensuring your web applications perform well in production. It also covers key capabilities to look for in a load testing solution. Without the right tools in place, simply moving your testing activities to the cloud will likely not deliver the results necessary to justify the move. Understanding how to apply the right tools and practices to make the most of the cloud is fundamental to cloud-based testing and vital to ultimately going live with total peace of mind.

Benefits of Load Testing in the Cloud
Load testing with the cloud enables testing teams to take a big step forward in conducting more efficient and more realistic large-scale tests. It also allows organizations to realize the significant cost and time savings made possible by cloud technology.

Perform Large-Scale Tests
More and more, today's web applications are experiencing sporadic surges in traffic. These traffic spikes can have many causes, including a new advertising campaign, an online article, a seasonal sale, and buzz on Twitter or other social media. If your application is unable to handle the increased load, you run the risk of lost business opportunities and potential damage to your brand.

Generating the load for large-scale tests to mimic these unanticipated spikes in production traffic, however, typically requires tens or even hundreds of machines. Purchasing and configuring these systems requires a significant investment of time and money. Once acquired and used for the immediate load testing need, the machines may sit unused for long stretches until they are needed for the next large-scale load testing project. With the cloud, you can rapidly set up as many load-generating machines as you need, on demand.
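
To make this concrete, here is a minimal sketch of on-demand provisioning, assuming AWS EC2 accessed through the boto3 library; the AMI ID, instance type, and tags are placeholders, and in practice an integrated load testing solution performs this step for you.

```python
# Minimal sketch: provision N load-generator instances on demand with boto3.
# The AMI ID, instance type, and tag values below are placeholders.
import boto3

def launch_load_generators(count, region="us-east-1",
                           ami_id="ami-0123456789abcdef0",  # hypothetical generator image
                           instance_type="c5.large"):
    ec2 = boto3.client("ec2", region_name=region)
    response = ec2.run_instances(
        ImageId=ami_id,
        InstanceType=instance_type,
        MinCount=count,
        MaxCount=count,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "role", "Value": "load-generator"}],
        }],
    )
    return [i["InstanceId"] for i in response["Instances"]]

def terminate_load_generators(instance_ids, region="us-east-1"):
    # Pay-as-you-go: release the machines as soon as the test run is over.
    boto3.client("ec2", region_name=region).terminate_instances(InstanceIds=instance_ids)
```

Terminating the generators as soon as the run ends is what makes the pay-as-you-go economics discussed later in this article work in your favor.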

Perform More Realistic Tests
When testing a web application using machines inside your firewall, you're not testing the entire delivery chain. Unless all of your end users will also be within your firewall, such tests are inherently limited and may fail to reveal all performance issues.

With the cloud, you can execute load tests that access your web application as your users will - from outside of your firewall - and validate all components of the delivery chain, including the firewall, DNS, network equipment, and ISP. These tests are more realistic, and they enable you to evaluate the real-world effects of third-party components, such as content delivery networks, analytics servers, and ad servers.

Your users won't all be accessing your app from the same fixed location across the same network, so a realistic load test cannot be conducted from a single location. That's why it's important to test your application and its components from different locations and geographic regions and to assess its performance as network bandwidth and latency change.
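
As an illustration of that geographic spread, the sketch below launches generators in several regions; the region list, AMI IDs, and instance counts are hypothetical, and the machine image must be available (or copied) in each target region.

```python
# Sketch: spread load generators across several regions so traffic reaches the
# application over different network paths. Regions, AMI IDs, and counts are
# illustrative only.
import boto3

FLEET_PLAN = {
    "us-east-1":      {"ami": "ami-0aaaaaaaaaaaaaaaa", "count": 5},
    "eu-west-1":      {"ami": "ami-0bbbbbbbbbbbbbbbb", "count": 3},
    "ap-southeast-1": {"ami": "ami-0cccccccccccccccc", "count": 2},
}

def launch_distributed_generators():
    fleet = {}
    for region, plan in FLEET_PLAN.items():
        ec2 = boto3.client("ec2", region_name=region)
        result = ec2.run_instances(ImageId=plan["ami"], InstanceType="c5.large",
                                   MinCount=plan["count"], MaxCount=plan["count"])
        fleet[region] = [i["InstanceId"] for i in result["Instances"]]
    return fleet
```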

Save Time and Reduce Costs with Pay-as-you-go
When load testing with the cloud, there is no need to spend weeks setting up and configuring dozens of real machines. You can create and configure the machine image you need once and then replicate it in the cloud as many times as needed. Often, the cloud testing provider will automate this process as well, saving you even more time.

Further, the substantial up-front costs of purchasing and maintaining machines that may be used only infrequently are eliminated with the cloud. Using the pay-as-you-go model, you can rapidly set up the testing infrastructure you need, when you need it, and only for as long as you need it. From a business standpoint, the cloud lowers total cost of ownership, while increasing flexibility.

How to Choose a Cloud Testing Solution
While all cloud load testing solutions will enable you to make use of the cloud in some way, comparatively few enable you to follow all of the best practices outlined here and capitalize on the opportunities that load testing with the cloud offers. A highway lets you travel faster than a side street, but the vehicle you use makes a big difference in how quickly and how reliably you arrive at your destination. In much the same way, load testing with the cloud offers clear advantages over traditional load testing, but the tools you use are even more important to the quality of your tests.

When considering a cloud testing solution, ask the following questions:

  1. To what extent does the solution integrate with the cloud?
  2. Will the solution enable us to conduct realistic tests?
  3. Does the solution support unified tests inside and outside the firewall?
  4. Is the solution easy to use, or will we spend weeks learning and configuring it?
  5. Does the solution include full-featured reporting and decision-making modules to help our team make the most of the results?
  6. Does the solution support the technologies we used to build the application?

Integration with the Cloud Platform
If you opt for a solution that is not integrated with one or more cloud platforms, you'll need to handle several time-consuming tasks on your own. First, you'll need to learn how each platform you'll be using works, including its limitations and constraints. Second, you'll need to build, test, and maintain your own virtual machine images.

Load testing solutions that offer integration with the cloud simplify and accelerate the steps needed to use the cloud infrastructure. These solutions offer one or more of the following advantages over non-integrated alternatives:

  • Fast provisioning using preconfigured images. You can set up the infrastructure you need in minutes.
  • Simplified security. All required protections are set up by default, including firewall, certificates, and encryption.
  • Improved scalability. Leading load testing solution providers have negotiated with cloud providers to allow users of their software to employ more virtual machines (for the purpose of load testing) than are allowed by default.
  • A unified interface for multiple cloud providers. Load testing solutions can hide provisioning and billing details, so you can take maximum advantage of the cloud in a minimum of time.
  • Advanced test launching. You can save time and effort by defining and launching load generators in the cloud directly from the load testing interface.
  • Advanced results reporting. Distinct results from each geographic region involved in the test are available for analysis.

Of course, few solutions include every one of these integration capabilities. Most solutions fall somewhere on the spectrum between little or no integration and full-featured integration with multiple cloud platforms.

Realistic Tests
Although testing from the cloud is, in many cases, more realistic than testing in the lab, simply moving to the cloud is not enough to ensure the most realistic tests. Real users often have access to less bandwidth than a load generator in a cloud data center. With a slower connection, the real user will have to wait longer than the load generator to download all the data needed for a web page or application. This has two major implications:

  • Response times measured from the cloud, where bandwidth is virtually unlimited, will be better than those real users experience. This can lead test engineers to draw the wrong conclusions, thinking that users will see an acceptable response time when in reality they will not.
  • The total number of connections established with the server will increase, because on average, connections for real users will be open longer than connections for the load generator. This can lead to a situation in which the server unexpectedly refuses additional connections under load.

When choosing a load testing solution, look for one that provides a bandwidth simulation feature that limits bandwidth to ensure that the virtual users download the content of the web application at a realistic rate. This capability is particularly important when testing mobile applications, because mobile devices typically operate with less bandwidth than laptops and desktops.
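
Bandwidth simulation amounts to capping how fast a virtual user is allowed to read the response. The sketch below, using the requests library and illustrative numbers (roughly a 3G-class connection), throttles a single download to a fixed rate.

```python
# Sketch: cap a virtual user's download rate so content arrives at a realistic
# speed instead of at the data center's full bandwidth. Values are illustrative.
import time
import requests

def throttled_get(url, bandwidth_bytes_per_sec=96_000, chunk_size=8_192):
    start = time.monotonic()
    downloaded = 0
    response = requests.get(url, stream=True, timeout=30)
    for chunk in response.iter_content(chunk_size=chunk_size):
        downloaded += len(chunk)
        # Sleep just long enough that the average rate never exceeds the cap.
        expected_elapsed = downloaded / bandwidth_bytes_per_sec
        actual_elapsed = time.monotonic() - start
        if expected_elapsed > actual_elapsed:
            time.sleep(expected_elapsed - actual_elapsed)
    return time.monotonic() - start  # per-user download time at realistic bandwidth
```

Notice that the throttled connection stays open longer, which is exactly the connection-count effect described above.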

Similarly, look for a solution that can parallelize requests. Modern browsers have the ability to parallelize HTTP requests as they retrieve a web page's static resources. These parallel requests require more connections with the server and can lengthen response times. Load testing solutions that do not parallelize requests are incapable of producing truly realistic performance tests for web applications.
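
The sketch below mimics that browser behavior by fetching a page's static resources through a bounded pool of parallel connections; the resource URLs and the limit of six connections per host are illustrative.

```python
# Sketch: download static resources in parallel, the way a browser opens
# roughly six connections per host. URLs are illustrative.
from concurrent.futures import ThreadPoolExecutor
import requests

STATIC_RESOURCES = [
    "https://www.example.com/css/site.css",
    "https://www.example.com/js/app.js",
    "https://www.example.com/img/logo.png",
]

def fetch(url):
    return url, requests.get(url, timeout=30).status_code

def fetch_page_resources(urls, max_parallel=6):
    # More simultaneous connections per virtual user, shorter page load time.
    with ThreadPoolExecutor(max_workers=max_parallel) as pool:
        return dict(pool.map(fetch, urls))
```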

Unified Lab Testing and Cloud Testing
Organizations that use only lab testing or only cloud testing are at a disadvantage. So are companies that use different tools for these activities.

A solution that supports lab testing enables test engineers to begin verifying the performance of an application internally, before it's ready to be made available via the Internet. This makes it possible to find and fix performance problems earlier in the application lifecycle. Such a solution also lowers cloud costs by enabling teams to conduct internal performance tests on existing hardware when available.

More important, a single solution that supports lab testing and cloud testing enables test engineers to reuse scripts for both kinds of tests, which saves time and effort. Reusing scripts also helps pinpoint performance problems that show up in cloud testing but not in internal tests. Last, a unified solution lowers licensing and training costs, and enables test engineers to use their existing skill set for both types of load testing.
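
A minimal sketch of that reuse, assuming a simple requests-based scenario: the same virtual-user journey is pointed at an internal lab host or at the public, cloud-facing URL purely through configuration. The hostnames and endpoints are placeholders.

```python
# Sketch: one scenario definition reused unchanged for lab and cloud runs;
# only the target environment differs. Hostnames and endpoints are placeholders.
import requests

ENVIRONMENTS = {
    "lab":   "http://app.internal.example:8080",  # inside the firewall
    "cloud": "https://www.example.com",           # as real users reach it
}

def checkout_scenario(base_url):
    """A single virtual-user journey: browse, add to cart, check out."""
    session = requests.Session()
    session.get(f"{base_url}/catalog", timeout=30)
    session.post(f"{base_url}/cart", data={"sku": "1234", "qty": 1}, timeout=30)
    return session.post(f"{base_url}/checkout", timeout=30).status_code

def run(environment="lab"):
    return checkout_scenario(ENVIRONMENTS[environment])
```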

Ease of Use
Testing, with its natural position toward the end of the application lifecycle, is almost always performed under tight time constraints. Delays in the requirements or implementation phases of a project usually result in less time for the test engineers to do their jobs. The pressure is on to deliver results as quickly as possible. This environment is no place for a tool that is difficult to use and configure.

In developing and executing performance tests (either internally or via the cloud), several key features go a long way toward improving test engineer productivity, including support for:

  • Easily launching the recording of a virtual user profile (preferably in one click).
  • Defining advanced behaviors (with structures such as conditions and loops) via a graphical interface, complemented by the ability to use a scripting language (JavaScript, for example) for more complex cases.
  • Automatic handling of dynamic parameters. This includes a set of correlation rules for well-known server frameworks. Ideally, the solution will dynamically detect and handle custom parameters specific to your application (a minimal correlation sketch follows this list).
  • Sharing common script parts, such as login or logout transactions, between multiple virtual user profiles.
  • Comparing results. Sifting through results to determine the effect of a particular application or infrastructure change can be a time-consuming and arduous task without a dedicated comparison tool.
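
As an example of the dynamic-parameter item above, here is a minimal correlation sketch: it extracts a server-generated token from one response and replays it in the next request. The field name csrf_token and the URLs are hypothetical.

```python
# Sketch: correlate a dynamic parameter by capturing the server-generated
# value instead of replaying the one recorded in the original script.
# The csrf_token field and URLs are hypothetical.
import re
import requests

def login_with_dynamic_token(base_url, username, password):
    session = requests.Session()
    login_page = session.get(f"{base_url}/login", timeout=30).text
    match = re.search(r'name="csrf_token" value="([^"]+)"', login_page)
    token = match.group(1) if match else ""
    return session.post(f"{base_url}/login", timeout=30, data={
        "username": username,
        "password": password,
        "csrf_token": token,  # replay the freshly captured value
    })
```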

This is not an exhaustive list of usability features that can help test engineers work more efficiently; rather, it should be considered a baseline of minimum capabilities for an efficient load testing solution.

Analysis, Monitoring, Scheduling, and Reporting
Recording a virtual user profile and playing it back to get raw results is only the beginning of an effective performance test. You need tools to help you analyze the results (in real time when possible), find the root cause of problems, and produce actionable results.

Real-time analysis enables you to detect and understand issues while the test is running. With real-time analysis, you don't have to wait for the test to finish before detecting an issue, correcting it, and restarting the test. When testing in production, real-time analysis enables you to abort a test if it threatens to affect the performance experienced by real users.

A comprehensive monitoring system is essential when you need to find the root cause of a problem. Predefined performance counters and threshold alerts based on industry best practices make it easy to decide which metrics to watch and how to interpret them. For a nonintrusive option that is easier to set up, look for a tool that supports agentless remote monitoring.
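
A threshold alert over monitored counters can be as simple as the sketch below; the counter names and limits are illustrative, and a real solution evaluates them continuously while the test runs.

```python
# Sketch: flag monitored counters that exceed predefined thresholds so issues
# surface in real time. Counter names and limits are illustrative.
THRESHOLDS = {
    "cpu_percent":       85.0,
    "heap_used_percent": 90.0,
    "avg_response_ms": 2000.0,
}

def check_counters(samples):
    """samples: e.g. {"cpu_percent": 92.1, ...} gathered by the monitoring module."""
    alerts = []
    for counter, limit in THRESHOLDS.items():
        value = samples.get(counter)
        if value is not None and value > limit:
            alerts.append(f"{counter}={value} exceeds threshold {limit}")
    return alerts
```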

If your organization performs regular regression tests - and even if it doesn't - you may want to schedule performance tests and execute them automatically via the command line to complement functional testing. Regularly scheduled load tests with automatically generated reports can help organizations detect performance regression as soon as it starts to occur, which makes it easier to pinpoint and correct.
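
One way to wire this into a scheduler or CI job is sketched below; the run-load-test command and the JSON report layout are hypothetical stand-ins for whatever command-line interface and report format your tool provides.

```python
# Sketch: a nightly regression gate. The "run-load-test" command and the
# results.json layout are hypothetical; substitute your tool's CLI and report.
import json
import subprocess
import sys

BASELINE_P90_MS = 1200.0  # 90th-percentile response time from the last good run

def nightly_check():
    subprocess.run(["run-load-test", "--scenario", "checkout",
                    "--report", "results.json"], check=True)
    with open("results.json") as fh:
        p90 = json.load(fh)["response_time_p90_ms"]
    if p90 > BASELINE_P90_MS * 1.10:  # flag regressions worse than 10%
        print(f"Performance regression: p90 {p90} ms vs baseline {BASELINE_P90_MS} ms")
        sys.exit(1)

if __name__ == "__main__":
    nightly_check()
```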

Last, reporting is a key capability and essential for communicating test results to others on the team, including management. Because reporting needs change, it is a good idea to keep your options open with a tool that supports multiple formats, including PDF, Word, HTML, and XML for integration with other systems.

Support for Web Technologies
To test Siebel applications or applications built with Adobe Flex, Microsoft Silverlight, Real-Time Messaging Protocol (RTMP), Oracle Forms, or AJAX push technologies, you need a load testing tool with built-in support for the technologies you're using. Without this specialized support, it can be very difficult, if not impossible, to effectively test the performance of your applications.

Similarly, the load testing solution you choose should provide support for the authentication mechanism employed by your applications, whether it is Basic, Digest, NTLM, or Kerberos. Otherwise, you won't be able to set up a virtual user profile that tests the application as a real person would use it.
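
For illustration, here is a minimal sketch of a virtual user authenticating with Basic or Digest using the requests library; NTLM and Kerberos need additional packages (such as requests-ntlm) and are only hinted at in a comment.

```python
# Sketch: authenticate a virtual user the way a real user's browser would.
# requests ships Basic and Digest support; NTLM/Kerberos need extra packages.
import requests
from requests.auth import HTTPBasicAuth, HTTPDigestAuth

def fetch_protected(url, scheme="basic", user="vuser1", password="secret"):
    auth = {
        "basic":  HTTPBasicAuth(user, password),
        "digest": HTTPDigestAuth(user, password),
        # "ntlm": HttpNtlmAuth(f"DOMAIN\\{user}", password),  # requires requests-ntlm
    }[scheme]
    return requests.get(url, auth=auth, timeout=30).status_code
```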

Summing It Up
The cloud is opening new opportunities to improve the scale and realism of load testing while saving time and lowering costs. When selecting a cloud testing solution, keep in mind that the primary factor in your success will not be simply the move to the cloud, but rather the tool you use and how well it uses cloud technology.

More Stories By Steve Weisfeldt

Steve Weisfeldt is a Senior Performance Engineer at Neotys, a provider of load testing software for Web applications. Previously, he worked as President of Engine 1 Consulting, a services firm specializing in all facets of test automation. Prior to that, he was a Senior Systems Engineer at Aternity, and before that he spent seven years at automated testing vendor Segue Software (acquired by Borland). While spending most of his time at Segue delivering professional services and training, he was also involved in pre-sales and product marketing efforts.

Having worked in the load and performance testing space since 1999, Steve has been involved in load and performance testing projects of all sizes, in industries spanning the retail, financial services, insurance, and manufacturing sectors. His expertise lies in enabling organizations to optimize their ability to develop, test, and launch high-quality applications efficiently, on time, and on budget. Steve graduated from the University of Massachusetts-Lowell with a BS in Electrical Engineering and an MS in Computer Engineering.


