Benefits of Load Testing in the Cloud (Part 1)

How to choose the right approach

Many companies have moved applications to the cloud as a way to reduce capital expenditure while improving IT focus and effectiveness. End users see the cloud as a way to access their documents and applications remotely from anywhere and from any device. IT managers see the cloud as a means of rapidly adapting their infrastructures as needed via virtualization and a pay-per-use model. But what about load testing engineers? Can they seize the opportunities afforded by the cloud to better test the performance of web applications?

As with past overhyped trends in IT, it is important to see past all the talk and look for concrete ways to take advantage of this new technology's flexibility and scalability to save time, reduce costs, and improve the way your organization works.

This article describes how the cloud is revolutionizing load testing and the advantages it provides in many situations for ensuring your web applications perform well in production. It also covers key capabilities to look for in a load testing solution. Without the right tools in place, simply moving your testing activities to the cloud will likely not deliver the results necessary to justify the move. Understanding how to apply the right tools and practices to make the most of the cloud is fundamental to cloud-based testing and vital to ultimately going live with total peace of mind.

Benefits of Load Testing in the Cloud
Load testing with the cloud enables testing teams to take a big step forward in conducting more efficient and more realistic large-scale tests. It also enables organizations to realize the significant savings in cost and time that cloud technology makes possible.

Perform Large-Scale Tests
More and more, today's web applications are experiencing sporadic surges in traffic. These traffic spikes can have many causes, including a new advertising campaign, an online article, a seasonal sale, and buzz on Twitter or other social media. If your application is unable to handle the increased load, you run the risk of lost business opportunities and potential damage to your brand.

Generating the load for large-scale tests to mimic these unanticipated spikes in production traffic, however, typically requires tens or even hundreds of machines. Purchasing and configuring these systems requires a significant investment of time and money. Once acquired and used for the immediate load testing need, the machines may sit unused for long stretches until they are needed for the next large-scale load testing project. With the cloud, you can rapidly set up as many load-generating machines as you need, on demand.
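
For illustration, here is a minimal sketch of on-demand provisioning, assuming AWS EC2 via the boto3 library; the AMI ID, instance type, and tag values are placeholders rather than recommendations:

# Sketch: provisioning on-demand load generators on AWS EC2 with boto3.
# The AMI ID, instance type, and tag values are illustrative placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

def launch_load_generators(count, ami_id="ami-0123456789abcdef0",
                           instance_type="c5.large"):
    """Start `count` identical load-generator instances from a prebuilt image."""
    response = ec2.run_instances(
        ImageId=ami_id,
        InstanceType=instance_type,
        MinCount=count,
        MaxCount=count,
        TagSpecifications=[{
            "ResourceType": "instance",
            "Tags": [{"Key": "role", "Value": "load-generator"}],
        }],
    )
    return [i["InstanceId"] for i in response["Instances"]]

def terminate_load_generators(instance_ids):
    """Tear the fleet down as soon as the test ends to stop the meter."""
    ec2.terminate_instances(InstanceIds=instance_ids)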

Perform More Realistic Tests
When testing a web application using machines inside your firewall, you're not testing the entire delivery chain. Unless all of your end users will also be within your firewall, such tests are inherently limited and may fail to reveal all performance issues.

With the cloud, you can execute load tests that access your web application as your users will - from outside of your firewall - and validate all components of the delivery chain, including the firewall, DNS, network equipment, and ISP. These tests are more realistic, and they enable you to evaluate the real-world effects of third-party components, such as content delivery networks, analytics servers, and ad servers.

Your users won't all be accessing your app from the same fixed location across the same network, so a realistic load test cannot be completed from a single location. That's why it's important to test your application and its components from different locations and geographic regions and to assess its performance as network bandwidth and latency change.
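
A simple way to make per-region comparison possible is to tag every measurement with the region of the load generator that produced it. The sketch below assumes a REGION environment variable is set when the generator is provisioned and uses a placeholder target URL:

# Sketch: the same scripted scenario run from load generators in several
# regions, with each result tagged by the generator's region so per-region
# response times can be compared afterwards.
import os
import time
import urllib.request

REGION = os.environ.get("REGION", "unknown")
TARGET = "https://www.example.com/"   # placeholder URL

def timed_request(url):
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=30) as resp:
        resp.read()
        status = resp.status
    return time.perf_counter() - start, status

if __name__ == "__main__":
    for _ in range(10):
        elapsed, status = timed_request(TARGET)
        # One CSV line per request; aggregate these lines across regions later.
        print(f"{REGION},{status},{elapsed:.3f}")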

Save Time and Reduce Costs with Pay-as-you-go
When load testing with the cloud, there is no need to spend weeks setting up and configuring dozens of real machines. You can create and configure the machine image you need once and then replicate it in the cloud as many times as needed. Often, the cloud testing provider will automate this process as well, saving you even more time.

Further, the substantial up-front costs of purchasing and maintaining machines that may be used only infrequently are eliminated with the cloud. Using the pay-as-you-go model, you can rapidly set up the testing infrastructure you need, when you need it, and only for as long as you need it. From a business standpoint, the cloud lowers total cost of ownership, while increasing flexibility.
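
To see the difference, consider a rough back-of-the-envelope comparison; every figure below is an assumption chosen purely for illustration, not a vendor quote:

# Illustrative cost comparison only; the figures below are assumptions.
OWNED_SERVERS = 20
PURCHASE_COST_PER_SERVER = 3000        # up-front hardware cost (assumed)
ANNUAL_MAINTENANCE_PER_SERVER = 300    # power, space, admin (assumed)

CLOUD_HOURLY_RATE = 0.20               # per load generator per hour (assumed)
TEST_CAMPAIGNS_PER_YEAR = 6
HOURS_PER_CAMPAIGN = 8

owned = OWNED_SERVERS * (PURCHASE_COST_PER_SERVER + ANNUAL_MAINTENANCE_PER_SERVER)
cloud = OWNED_SERVERS * CLOUD_HOURLY_RATE * TEST_CAMPAIGNS_PER_YEAR * HOURS_PER_CAMPAIGN

print(f"First-year cost of owned lab:     ${owned:,.2f}")   # $66,000.00
print(f"First-year cost of pay-as-you-go: ${cloud:,.2f}")   # $192.00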

How to Choose a Cloud Testing Solution
While all cloud load testing solutions will enable you to make use of the cloud in some way, comparatively few enable you to follow all of the best practices outlined here and capitalize on the opportunities that load testing with the cloud offers. A highway lets you travel faster than a side street, but the vehicle you use makes a big difference in how quickly and how reliably you arrive at your destination. In much the same way, load testing with the cloud offers clear advantages over traditional load testing, but the tools you use are even more important to the quality of your tests.

When considering a cloud testing solution, ask the following questions:

  1. To what extent does the solution integrate with the cloud?
  2. Will the solution enable us to conduct realistic tests?
  3. Does the solution support unified tests inside and outside the firewall?
  4. Is the solution easy to use, or will we spend weeks learning and configuring it?
  5. Does the solution include full-featured reporting and decision-making modules to help our team make the most of the results?
  6. Does the solution support the technologies we used to build the application?

Integration with the Cloud Platform
If you opt for a solution that is not integrated with one or more cloud platforms, you'll need to handle several time-consuming tasks on your own. First, you'll need to learn how each platform you'll be using works, including its limitations and constraints. Second, you'll need to build, test, and maintain your own virtual machine images.

Load testing solutions that offer integration with the cloud simplify and accelerate the steps needed to use the cloud infrastructure. These solutions offer one or more of the following advantages over non-integrated alternatives:

  • Fast provisioning using preconfigured images. You can set up the infrastructure you need in minutes.
  • Simplified security. All required protections are set up by default, including firewall, certificates, and encryption.
  • Improved scalability. Leading load testing solution providers have negotiated with cloud providers to allow users of their software to employ more virtual machines (for the purpose of load testing) than are allowed by default.
  • A unified interface for multiple cloud providers. Load testing solutions can hide provisioning and billing details, so you can take maximum advantage of the cloud in a minimum of time.
  • Advanced test launching. You can save time and effort by defining and launching load generators in the cloud directly from the load testing interface.
  • Advanced results reporting. Distinct results from each geographic region involved in the test are available for analysis.

Of course, few solutions include every one of these integration capabilities. Most solutions fall somewhere on the spectrum between little or no integration and full-featured integration with multiple cloud platforms.

Realistic Tests
Although testing from the cloud is, in many cases, more realistic than testing in the lab, simply moving to the cloud is not enough to ensure the most realistic tests. Real users often have access to less bandwidth than a load generator in a cloud data center. With a slower connection, the real user will have to wait longer than the load generator to download all the data needed for a web page or application. This has two major implications:

  • Response times measured from the cloud, with its virtually unlimited bandwidth, are better than those real users will experience. This can lead test engineers to draw the wrong conclusions, thinking that users will see an acceptable response time when in reality they will not.
  • The total number of connections established with the server will increase, because on average, connections for real users will be open longer than connections for the load generator. This can lead to a situation in which the server unexpectedly refuses additional connections under load.

When choosing a load testing solution, look for one that provides a bandwidth simulation feature that limits bandwidth to ensure that the virtual users download the content of the web application at a realistic rate. This capability is particularly important when testing mobile applications, because mobile devices typically operate with less bandwidth than laptops and desktops.
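
A bandwidth simulation feature essentially paces each virtual user's downloads so they never exceed the profile being simulated. Here is a minimal sketch of that pacing logic in Python, with a placeholder URL and an assumed ~1 Mbit/s mobile profile:

# Sketch of bandwidth simulation: the virtual user reads the response in
# chunks and sleeps between reads so the effective download rate never
# exceeds the simulated bandwidth profile.
import time
import urllib.request

def throttled_download(url, max_bytes_per_sec):
    """Download `url` no faster than `max_bytes_per_sec`; return elapsed seconds."""
    chunk_size = 16 * 1024
    start = time.perf_counter()
    received = 0
    with urllib.request.urlopen(url, timeout=60) as resp:
        while True:
            chunk = resp.read(chunk_size)
            if not chunk:
                break
            received += len(chunk)
            # If we are ahead of the allowed rate, wait until we are back on schedule.
            expected_elapsed = received / max_bytes_per_sec
            actual_elapsed = time.perf_counter() - start
            if expected_elapsed > actual_elapsed:
                time.sleep(expected_elapsed - actual_elapsed)
    return time.perf_counter() - start

# A 3G-class mobile profile (~1 Mbit/s, i.e. 125,000 bytes/s) will produce very
# different response times than an unthrottled data-center link for the same page.
print(throttled_download("https://www.example.com/", max_bytes_per_sec=125_000))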

Similarly, look for a solution that can parallelize requests. Modern browsers have the ability to parallelize HTTP requests as they retrieve a web page's static resources. These parallel requests require more connections with the server and can lengthen response times. Load testing solutions that do not parallelize requests are incapable of producing truly realistic performance tests for web applications.
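
As a rough illustration, the sketch below fetches a page's static resources over a small pool of parallel connections, mimicking the per-host connection limit (commonly six) that browsers apply; the URLs are placeholders:

# Sketch: fetching a page's static resources over a browser-like pool of
# parallel connections. The resource URLs are placeholders.
import time
from concurrent.futures import ThreadPoolExecutor
import urllib.request

STATIC_RESOURCES = [
    "https://www.example.com/css/site.css",
    "https://www.example.com/js/app.js",
    "https://www.example.com/img/logo.png",
    "https://www.example.com/img/banner.jpg",
]

def fetch(url):
    with urllib.request.urlopen(url, timeout=30) as resp:
        return url, resp.status, len(resp.read())

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=6) as pool:   # mimic per-host connection limit
    results = list(pool.map(fetch, STATIC_RESOURCES))
print(f"Fetched {len(results)} resources in {time.perf_counter() - start:.2f}s")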

Unified Lab Testing and Cloud Testing
Organizations that use only lab testing or only cloud testing are at a disadvantage. So are companies that use different tools for these activities.

A solution that supports lab testing enables test engineers to begin verifying the performance of an application internally, before it's ready to be made available via the Internet. This makes it possible to find and fix performance problems earlier in the application lifecycle. Such a solution also lowers cloud costs by enabling teams to conduct internal performance tests on existing hardware when available.

More important, a single solution that supports lab testing and cloud testing enables test engineers to reuse scripts for both kinds of tests, which saves time and effort. Reusing scripts also helps pinpoint performance problems that show up in cloud testing but not in internal tests. Last, a unified solution lowers licensing and training costs, and enables test engineers to use their existing skill set for both types of load testing.
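
In practice, reuse can be as simple as parameterizing the target host so the same scenario runs unchanged against the lab and the public endpoint. A minimal sketch, assuming a TARGET_HOST environment variable and an illustrative user journey:

# Sketch: one scenario definition reused for lab and cloud runs by swapping
# only the target host. Internal run:  TARGET_HOST=http://app.lab.internal
#                        Cloud run:    TARGET_HOST=https://www.example.com
import os
import urllib.request

BASE_URL = os.environ.get("TARGET_HOST", "http://app.lab.internal")

SCENARIO = ["/", "/login", "/catalog", "/checkout"]   # illustrative user journey

for path in SCENARIO:
    with urllib.request.urlopen(BASE_URL + path, timeout=30) as resp:
        print(f"{resp.status} {BASE_URL + path}")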

Ease of Use
Testing, with its natural position toward the end of the application lifecycle, is almost always performed under tight time constraints. Delays in the requirements or implementation phases of a project usually result in less time for the test engineers to do their jobs. The pressure is on to deliver results as quickly as possible. This environment is no place for a tool that is difficult to use and configure.

In developing and executing performance tests (either internally or via the cloud), several key features go a long way toward improving test engineer productivity, including support for:

  • Easily launching the recording of a virtual user profile (preferably in one click).
  • Defining advanced behaviors (with structures such as conditions and loops) via a graphical interface, complemented by the ability to use a scripting language (JavaScript, for example) for more complex cases.
  • Automatic handling of dynamic parameters. This includes a set of correlation rules for well-known server frameworks. Ideally, the solution will dynamically detect and handle custom parameters specific to your application.
  • Sharing common script parts, such as login or logout transactions, between multiple virtual user profiles.
  • Comparing results. Sifting through results to determine the effect of a particular application or infrastructure change can be a time-consuming and arduous task without a dedicated comparison tool.

This is not an exhaustive list of usability features that can help test engineers work more efficiently; rather, it should be considered a baseline of minimum capabilities required for an efficient load testing solution.
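
To make the dynamic-parameter point above concrete, here is a minimal sketch of a correlation rule: a hypothetical session token embedded in a login page is extracted with a regular expression and replayed on the next request (the field name, URLs, and credentials are illustrative):

# Sketch of correlating a dynamic parameter: extract a server-issued token
# from the login page and replay it instead of the recorded, now-stale value.
import re
import urllib.request
import urllib.parse

BASE_URL = "https://www.example.com"   # placeholder

# Correlation rule: capture the hidden token issued by the server.
TOKEN_RULE = re.compile(r'name="session_token"\s+value="([^"]+)"')

with urllib.request.urlopen(BASE_URL + "/login", timeout=30) as resp:
    page = resp.read().decode("utf-8", errors="replace")

match = TOKEN_RULE.search(page)
token = match.group(1) if match else ""

# Replay the freshly extracted value on the follow-up request.
form = urllib.parse.urlencode({"user": "vu01", "password": "secret",
                               "session_token": token}).encode()
with urllib.request.urlopen(BASE_URL + "/login", data=form, timeout=30) as resp:
    print(resp.status)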

Analysis, Monitoring, Scheduling, and Reporting
Recording a virtual user profile and playing it back to get raw results is only the beginning of an effective performance test. You need tools to help you analyze the results (in real time when possible), find the root cause of problems, and produce actionable results.

Real-time analysis enables you to detect and understand issues while the test is running. With real-time analysis, you don't have to wait until the test has finished to detect an issue, correct it, and restart the test. When testing in production, real-time analysis enables you to abort a test if it threatens to affect the performance experienced by real users.
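
A common way to implement this safeguard is a sliding-window check on the live result stream that aborts the test once an error-rate threshold is crossed. The sketch below uses placeholder thresholds and a simulated result feed:

# Sketch: abort a production test when the live error rate crosses a threshold.
# The threshold, window size, and simulated result feed are illustrative.
import sys
import time
import random  # stands in for a live feed of per-request results

ERROR_RATE_THRESHOLD = 0.05   # abort if more than 5% of recent requests fail
WINDOW = 200                  # sliding window of most recent requests

results = []                  # 1 = failed request, 0 = succeeded

for _ in range(10_000):       # placeholder for the duration of the test
    failed = random.random() < 0.02      # placeholder for a real result stream
    results.append(1 if failed else 0)
    if len(results) > WINDOW:
        results.pop(0)
    error_rate = sum(results) / len(results)
    if len(results) == WINDOW and error_rate > ERROR_RATE_THRESHOLD:
        print(f"Error rate {error_rate:.1%} exceeded threshold; aborting test.")
        sys.exit(1)
    time.sleep(0.05)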

A comprehensive monitoring system is essential when you need to find the root causes of a problem. Predefined performance counters and threshold alerts based on industry best practices make it easy to define and analyze counters. For a nonintrusive solution that is easier to set up, look for a tool that supports agentless remote monitoring.

If your organization performs regular regression tests - and even if it doesn't - you may want to schedule performance tests and execute them automatically via the command line to complement functional testing. Regularly scheduled load tests with automatically generated reports can help organizations detect performance regression as soon as it starts to occur, which makes it easier to pinpoint and correct.
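
A scheduled regression gate can be as simple as a command-line step that compares the latest run against a stored baseline and fails the job when a key metric regresses. The sketch below assumes JSON result files and a 20% tolerance, both of which are illustrative:

# Sketch of an automated regression gate run by cron or a CI job after a
# command-line load test. File names and the tolerance are assumptions.
import json
import statistics
import sys

TOLERANCE = 1.20   # fail the run if p90 regresses by more than 20%

with open("baseline.json") as f:
    baseline_p90 = json.load(f)["p90_response_time"]

with open("latest_run.json") as f:
    samples = json.load(f)["response_times"]

current_p90 = statistics.quantiles(samples, n=10)[8]   # 90th percentile

if current_p90 > baseline_p90 * TOLERANCE:
    print(f"Regression: p90 {current_p90:.2f}s vs baseline {baseline_p90:.2f}s")
    sys.exit(1)   # nonzero exit fails the scheduled job and flags the report
print(f"OK: p90 {current_p90:.2f}s within tolerance")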

Last, reporting is a key capability and essential for communicating test results to others on the team, including management. Because reporting needs change, it is a good idea to keep your options open with a tool that supports multiple formats, including PDF, Word, HTML, and XML for integration with other systems.

Support for Web Technologies
To test Siebel applications or applications built with Adobe Flex, Microsoft Silverlight, Real-Time Messaging Protocol (RTMP), Oracle Forms, or AJAX push technologies, you need a load testing tool with built-in support for the technologies you're using. Without this specialized support, it can be very difficult, if not impossible, to effectively test the performance of your applications.

Similarly, the load testing solution you choose should provide support for the authentication mechanism employed by your applications, whether it is Basic, Digest, NTLM, or Kerberos. Otherwise, you won't be able to set up a virtual user profile that tests the application as a real person would use it.
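
For example, with Python's standard library a virtual user can be configured for Basic authentication as follows (Digest works the same way via HTTPDigestAuthHandler); the URL and credentials are placeholders:

# Sketch: configuring HTTP Basic authentication for a virtual user with the
# standard library. The realm, URL, and credentials are illustrative.
import urllib.request

password_mgr = urllib.request.HTTPPasswordMgrWithDefaultRealm()
password_mgr.add_password(None, "https://www.example.com/", "vu01", "secret")

opener = urllib.request.build_opener(
    urllib.request.HTTPBasicAuthHandler(password_mgr)
)

with opener.open("https://www.example.com/protected", timeout=30) as resp:
    print(resp.status)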

Summing It Up
The cloud is opening new opportunities to improve the scale and realism of load testing while saving time and lowering costs. When selecting a cloud testing solution, keep in mind that the primary factor in your success will not be simply the move to the cloud, but rather the tool you use and how well it uses cloud technology.

More Stories By Steve Weisfeldt

Steve Weisfeldt is a Senior Performance Engineer at Neotys, a provider of load testing software for Web applications. Previously, he has worked as the President of Engine 1 Consulting, a services firm specializing in all facets of test automation. Prior to his involvement at Engine 1 Consulting, he was a Senior Systems Engineer at Aternity. Prior to that, Steve spent seven years at automated testing vendor Segue Software (acquired by Borland). While spending most of his time at Segue delivering professional services and training, he was also involved in pre-sales and product marketing efforts.

Being in the load and performance testing space since 1999, Steve has been involved in load and performance testing projects of all sizes, in industries that span the retail, financial services, insurance and manufacturing sectors. His expertise lies in enabling organizations to optimize their ability to develop, test and launch high-quality applications efficiently, on-time and on-budget. Steve graduated from the University of Massachusetts-Lowell with a BS in Electrical Engineering and an MS in Computer Engineering.
