Considerations for Choosing a Cloud Load Testing Solution

How to pick a tool to load test in the cloud

In a previous post, I discussed how load testing with the cloud can enable your team to conduct more efficient and more realistic large-scale tests while saving you (time and) money. With all this in mind, how do we go about choosing the right cloud load testing solution?

There are many cloud load testing solutions that will enable you to make use of the cloud. However, very few enable you to capitalize on the opportunities of load testing in the cloud. As I've discussed, load testing with the cloud offers clear advantages over traditional load testing in certain circumstances, but the tools you use are even more important to the quality of your tests.

When considering a cloud testing solution, ask the following questions:

  • To what extent does the solution integrate with the cloud?
  • Will the solution enable us to conduct realistic tests?
  • Does the solution support unified tests inside and outside the firewall?
  • Is the solution easy to use, or will we spend weeks learning and configuring it?
  • Does the solution include full-featured reporting and decision-making modules to help our team make the most of the results?
  • Does the solution support the technologies we used to build the application?

Much of this is outlined in the Neotys Cloud Whitepaper, but for the purposes of this blog, let's briefly talk about each of these:

Integration with the Cloud Platform
If you choose a solution that is not integrated with one or more cloud platforms, there are several manual steps you'll need to perform on your own. First, you'll need to learn how each platform you'll be using works, including its limitations and constraints. Second, you'll need to build, maintain and access the machines you wish to use.

Load testing solutions that offer integration with the cloud simplify and accelerate the steps needed to use the cloud infrastructure. With the proper solution, connecting to and utilizing the cloud should be possible with just a few mouse clicks.
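To make the difference concrete, here is a rough sketch of the kind of provisioning work a non-integrated tool leaves to you. It assumes AWS EC2 via boto3; the AMI ID, instance type and key name are placeholders I've made up for the example, not a recommendation.

```python
# A minimal sketch of manually provisioning cloud load generators,
# assuming AWS EC2 via boto3. The AMI ID, instance type, and key name
# are hypothetical placeholders.
import boto3

def provision_load_generators(count):
    """Launch `count` EC2 instances to serve as load generators."""
    ec2 = boto3.resource("ec2", region_name="us-east-1")
    instances = ec2.create_instances(
        ImageId="ami-XXXXXXXX",      # hypothetical image with the load generator installed
        InstanceType="t3.medium",    # placeholder instance size
        MinCount=count,
        MaxCount=count,
        KeyName="load-test-key",     # placeholder key pair
    )
    # Wait until every generator is up before starting the test.
    for instance in instances:
        instance.wait_until_running()
        instance.reload()            # refresh attributes such as the public IP
    return [i.public_ip_address for i in instances]
```

An integrated solution hides all of this (and the matching teardown when the test ends) behind those few clicks.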

Realistic Load Tests
Simply moving to the cloud is not always enough to ensure the most realistic tests. Real users often have access to less bandwidth than a load generator in a cloud data center. With a slower connection, the real user will have to wait longer than the load generator to download all the data needed for a web page or application. This has major implications: your performance results won't fully reflect what your real users will experience (here we go again with the inaccurate load testing).

When choosing a load testing solution, look for one that provides a bandwidth simulation feature that limits bandwidth to ensure that the virtual users download the content of the web application at a realistic rate. This capability is particularly important when testing mobile applications, because mobile devices typically operate with less bandwidth than laptops and desktops.
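To see why this matters, here's a rough, hand-rolled sketch of bandwidth simulation: throttling a download so a virtual user fetches the page at a realistic rate. The URL and the 750 kbit/s figure are illustrative assumptions, not any vendor's implementation.

```python
# A rough sketch of bandwidth simulation: throttle a download so the
# virtual user "sees" the page at a limited, mobile-like rate.
# The chunk size and 750 kbit/s default are illustrative assumptions.
import time
import requests

def timed_download(url, max_kbits_per_sec=750):
    """Fetch `url` at a capped bandwidth and return the elapsed seconds."""
    max_bytes_per_sec = max_kbits_per_sec * 1000 / 8
    start = time.monotonic()
    downloaded = 0
    with requests.get(url, stream=True) as resp:
        for chunk in resp.iter_content(chunk_size=8192):
            downloaded += len(chunk)
            # If we are ahead of the allowed rate, sleep until we are back on pace.
            expected = downloaded / max_bytes_per_sec
            elapsed = time.monotonic() - start
            if expected > elapsed:
                time.sleep(expected - elapsed)
    return time.monotonic() - start

# The same page, measured at fiber-like and 3G-like speeds, tells two
# very different performance stories:
# timed_download("https://example.com/", max_kbits_per_sec=50000)
# timed_download("https://example.com/", max_kbits_per_sec=750)
```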

Unified Lab Testing and Cloud Testing
While including the cloud in our load testing is obviously a huge advantage, it does not mean that we should exclude testing from within our own firewall. Both types of testing should be included - if we only do one or the other, we are again putting ourselves at a disadvantage. The same is true if we use different solutions for these different tests.

When choosing a load testing solution, you should make sure that it supports lab testing so you can confirm the performance of your application(s) internally, before you introduce additional variables by testing over the internet. This allows you to shake out any early performance issues with smaller loads before you execute tests from the cloud. Finding these issues earlier in the application's lifecycle will lower costs associated with not only the cloud but also those attributed to "late stage bugs".

Making sure that you use a single solution for both "internal" (within the lab and/or firewall) and "external" (via The Cloud) testing allows your engineers to reuse scripts across both types of tests. If you choose different solutions, you may have to create a new set of scripts for your external testing, costing you both time and money.
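As a simple illustration of that reuse, here's a sketch of one virtual-user scenario pointed at two targets - an internal lab host and the public endpoint hit from the cloud. The hostnames and pages are placeholders invented for the example.

```python
# A minimal sketch of "one script, two environments": the same virtual-user
# scenario runs against an internal lab target and the public endpoint
# tested from the cloud. Hostnames and paths are hypothetical.
import requests

TARGETS = {
    "lab":   "http://app.internal.example:8080",   # inside the firewall
    "cloud": "https://www.example.com",            # reached from cloud load generators
}

def browse_catalog(base_url):
    """One scenario, reused unchanged for both lab and cloud tests."""
    session = requests.Session()
    session.get(f"{base_url}/login")
    session.get(f"{base_url}/catalog")
    session.get(f"{base_url}/catalog/item/42")

# browse_catalog(TARGETS["lab"])    # early, smaller-load internal test
# browse_catalog(TARGETS["cloud"])  # later, large-scale test over the internet
```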

Ease of Use
As every test engineer knows (not just the performance testers, the functional testers can relate too!), testing is almost always performed under tight time constraints. Delays in development or other phases earlier in the lifecycle result in less time for the test engineers to do their jobs. The pressure is on to deliver results as quickly as possible. With this kind of stress, there is no place for a solution that is hard to use. Finding a tool that is easy to use is imperative. Not only does it save time and money, but it will allow you to get more testing done in the time you have - ensuring a higher quality application. While an exhaustive list of "ease of use" features might be best suited for another blog article, some key things you should look for are:

  • Easy recording of a virtual user profile (preferably in one click).
  • The ability to easily define advanced behaviors (with structures such as conditions and loops) via a graphical interface, without a need for writing code or test scripts.
  • Automatic handling of session parameters. This is probably the most challenging and time-consuming part of developing load testing user profiles (a hand-rolled illustration follows this list). Find a tool that does this for you automatically.
  • Easy results comparisons. Generating graphs and charts based on a lot of raw data can be excruciating - a tool that automates this ability is a must.
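To show what "automatic handling of session parameters" saves you from, here's the hand-rolled version of the problem: a dynamic token returned in one response has to be extracted and replayed on the next request, or the recorded script breaks. The field and page names ("csrf_token", /login, /order) are hypothetical.

```python
# A hand-rolled illustration of session parameter correlation, which a
# good tool performs automatically. The "csrf_token" field and the
# /login and /order pages are hypothetical.
import re
import requests

def login_and_order(base_url, user, password):
    session = requests.Session()
    login_page = session.get(f"{base_url}/login")
    # A recorded script that replays a captured token verbatim fails here;
    # the dynamic value must be re-extracted on every iteration.
    match = re.search(r'name="csrf_token" value="([^"]+)"', login_page.text)
    token = match.group(1) if match else ""
    session.post(f"{base_url}/login",
                 data={"user": user, "password": password, "csrf_token": token})
    session.post(f"{base_url}/order",
                 data={"item": "42", "csrf_token": token})
```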

Analysis, Monitoring, and Reporting
Creating realistic scenarios and running comprehensive load tests is only part of the story. The final steps, of course, are to understand how your application behaved while it was subjected to the test load. Did it meet performance requirements? If not, why? To do this, you need a powerful set of tools to analyze the data generated during the test.

Make sure you choose a solution that allows you to easily analyze the collected data, creating actionable reports that describe the performance of the tested application. I've seen too many test engineers spend hours, if not days, taking raw data from a load testing tool and pushing it through Excel or a statistical tool to create graphs, charts and reports. A tool that allows for easy analysis not only makes this process pain-free but can increase the collaboration among the different stakeholders of the application's infrastructure.

Making sure your load testing solution contains a comprehensive monitoring system is also essential when you need to find the root causes of a problem. If you are only looking at performance measures collected from the end user's point of view, you're missing half the picture. A monitoring system that shows you what was happening within the infrastructure of your web application is critical to identifying root issues. Moreover, the selected load testing solution should allow for easy correlation between data collected from the end user's point of view and data collected from any back-end servers. Without this ability, results analysis becomes extremely cumbersome as well as potentially inaccurate. Too many times in my load testing career I've tried to correlate performance data collected from a load testing tool with data collected from separate application server and database server monitoring tools - an almost impossible task.
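For a sense of what that correlation looks like when the data does live in one place, here's a sketch that lines up per-request response times with the nearest back-end CPU sample on a shared time axis. The CSV files and column names are assumptions made for the example.

```python
# A sketch of correlating end-user response times with back-end metrics
# on a shared time axis. The CSV files and column names are assumptions.
import pandas as pd

def correlate(client_csv, server_csv):
    """Join each request's response time with the nearest server CPU sample."""
    client = pd.read_csv(client_csv, parse_dates=["timestamp"]).sort_values("timestamp")
    server = pd.read_csv(server_csv, parse_dates=["timestamp"]).sort_values("timestamp")
    merged = pd.merge_asof(client, server, on="timestamp",
                           direction="nearest", tolerance=pd.Timedelta("5s"))
    # A strong correlation here points at the application tier rather than the network.
    return merged[["response_time_ms", "app_server_cpu_pct"]].corr()
```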

Support for Web Technologies
This one is pretty simple. If your application has been developed using more advanced web technologies, make sure the solution you choose has the proper support. With applications built with Adobe Flex, Microsoft Silverlight, Real-Time Messaging Protocol (RTMP), AJAX push technologies, etc. becoming more and more prevalent, the proper technology support is critical. Without this support, effectively testing the performance of your application is next to impossible. I'm sure that any of you who have tried to load test a Siebel application are nodding your heads as you read this.

While weighing the above topics in your search for a cloud load testing solution, you may also want to consider using MULTIPLE cloud computing providers. There are several advantages to this approach.

First, multiple providers may allow you to test from more geographical regions. Earlier, we talked about how testing from multiple regions can provide a more realistic test scenario. With this in mind, combining the regions available from multiple providers can lead to an even MORE realistic scenario. In other words, if a single provider does not give you the geographical coverage you need to emulate (based on where the real world users of your application "live"), aggregating multiple regions can be very useful.

Second, if you are executing exceptionally large scale tests, engaging multiple providers simultaneously allows you to bypass any limitations that a single provider may place on bandwidth or the number of machines in use.
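As a simple illustration, a multi-provider test plan might look something like the sketch below, splitting the total load across regions from two providers both to match where real users live and to stay under any single provider's machine or bandwidth caps. The provider names, regions and counts are purely illustrative.

```python
# An illustrative multi-provider test plan: virtual users are spread across
# regions from two providers. Provider names, regions and counts are made up.
TEST_PLAN = {
    "aws": {
        "us-east-1":     {"virtual_users": 4000},
        "eu-west-1":     {"virtual_users": 2000},
    },
    "azure": {
        "southeastasia": {"virtual_users": 2500},
        "brazilsouth":   {"virtual_users": 1500},
    },
}

def total_virtual_users(plan):
    """Sum the virtual users allocated across all providers and regions."""
    return sum(region["virtual_users"]
               for provider in plan.values()
               for region in provider.values())

# total_virtual_users(TEST_PLAN) -> 10000, spread over four regions and two providers
```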

Finally, using multiple cloud providers enables you to detect potential network issues at the cloud provider level. Let's say that during results analysis of a particular test run you notice significantly worse performance measures from the load generators of one cloud provider, while all the other load generators show acceptable performance. With this data, you can safely conclude that there is a problem (temporary or not) with that provider and not with your application. If you had locked yourself into a single cloud provider, you might be limited in your ability to conduct accurate large-scale tests.
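That sanity check is easy to automate. Here's a rough sketch that compares average response times per provider and flags any generator fleet that is far off from the rest; the threshold and the result structure are assumptions.

```python
# A sketch of the provider-level sanity check: flag any cloud provider whose
# load generators report response times far worse than the overall average.
# The tolerance factor and the result structure are assumptions.
from statistics import mean

def flag_suspect_providers(results, tolerance=2.0):
    """`results` maps provider name -> list of response times in milliseconds."""
    averages = {provider: mean(times) for provider, times in results.items()}
    overall = mean(averages.values())
    # A provider whose generators run `tolerance` times slower than the overall
    # average likely has a network problem of its own.
    return [p for p, avg in averages.items() if avg > tolerance * overall]

# flag_suspect_providers({"aws": [220, 240], "azure": [210, 230], "gcp": [1200, 1300]})
# -> ["gcp"]  (the problem is the provider, not your application)
```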

More Stories By Steve Weisfeldt

Steve Weisfeldt is a Senior Performance Engineer at Neotys, a provider of load testing software for Web applications. Previously, he worked as President of Engine 1 Consulting, a services firm specializing in all facets of test automation. Prior to his involvement at Engine 1 Consulting, he was a Senior Systems Engineer at Aternity. Prior to that, Steve spent seven years at automated testing vendor Segue Software (acquired by Borland). While spending most of his time at Segue delivering professional services and training, he was also involved in pre-sales and product marketing efforts.

Being in the load and performance testing space since 1999, Steve has been involved in load and performance testing projects of all sizes, in industries that span the retail, financial services, insurance and manufacturing sectors. His expertise lies in enabling organizations to optimize their ability to develop, test and launch high-quality applications efficiently, on-time and on-budget. Steve graduated from the University of Massachusetts-Lowell with a BS in Electrical Engineering and an MS in Computer Engineering.
