Hip? Or Hype? Load and Performance Testing from the Cloud

Some common approaches in choosing a cloud solution

Assuming you haven't spent the last couple of years living under a rock, you're bound to have been bombarded with all sorts of propaganda about "The Cloud." "The Cloud," according to the marketing types, is the greatest thing since the invention of bread, surely able to solve all of our needs, whether technology-related or not. While the hype for the cloud might be frequently and frustratingly overstated and confusingly applied in odd places (can someone please explain to me the Microsoft Cloud commercial where the woman goes "to the cloud!!" so she can generate a family photo? What does this have to do with the cloud? Isn't this just Photoshop?), I'd like to discuss one place where the cloud adds a great deal of value: load and performance testing.

In this article, I'm going to talk about the benefits of using the cloud as part of your load and performance testing practices, as well as point out some common approaches to choosing a cloud solution.

In a future article, I'll discuss some of the challenges the cloud presents and some best practices for dealing with them.

Those of you who have heard me present on load testing are well aware of how fixated I am on the need for load testing that truly emulates the real-world behavior of your end users. For those who haven't: in my opinion, if you are not doing accurate and realistic load testing, you might as well stop now. Inaccurate load testing is often worse than doing no testing at all. I've seen far too many organizations perform inaccurate testing - testing that falls short of mirroring what their end users are really doing - and then use the results to lull themselves into a false sense of security. They go live, their web application encounters end-user behavior that was never fully tested for, and they are surprised when it falls down.

In my mind, the cloud immediately brings two distinct advantages to our load and performance testing procedures that help us better model this realistic behavior. The first is what I call "Instant Infrastructure."

In today's economy, customer-facing web applications are experiencing vastly increased user loads. These loads might grow steadily over time due to the success of your business, or they might be more sporadic in nature, perhaps driven by a seasonal sale or a new advertising campaign. As we know, if your application is unable to handle this increased load - whether it suffers poor performance or downtime (or both) - you run the risk of losing business or damaging your brand.

To avoid these potential problems, performance testing should include load tests with very large user loads. Unfortunately, trying to maintain a test environment that generates this kind of load can be both costly and challenging. An environment of this size typically requires tens or even hundreds of machines, and purchasing and configuring those systems demands a significant investment of time and money. After the machines have been acquired, configured, and used for the immediate load testing need, they may sit unused for long stretches until the next large-scale load testing project comes along. Not a very cost-effective approach in many situations.

Here's where the "Instant Infrastructure" comes in. By accessing a test infrastructure in the cloud, you can provision as many load-generating machines as you need, on demand. There is no need to spend weeks setting up and configuring dozens of physical machines; the cloud testing provider automates this process and keeps the machines up to date. This saves you the substantial up-front cost of purchasing and maintaining the load-generating machines yourself. Most cloud testing providers also use a pay-as-you-go model, in which you access the testing infrastructure you need, when you need it, and only for as long as you need it. From a business standpoint, the cloud lowers total cost of ownership while increasing flexibility.
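
To make "Instant Infrastructure" a little more concrete, here is a minimal sketch of what provisioning on-demand load generators might look like if you scripted it yourself against a public cloud API (Python with AWS's boto3 library). The AMI ID, instance type, and counts are illustrative placeholders rather than recommendations, and most cloud testing providers hide this plumbing behind their own consoles.

    import boto3

    def launch_load_generators(count, region="us-east-1"):
        # Request 'count' identical instances built from an image that already has
        # your load-generation agent installed (the AMI ID here is a placeholder).
        ec2 = boto3.resource("ec2", region_name=region)
        instances = ec2.create_instances(
            ImageId="ami-00000000000000000",  # hypothetical image
            InstanceType="c5.large",          # example size; tune to your test scripts
            MinCount=count,
            MaxCount=count,
            TagSpecifications=[{
                "ResourceType": "instance",
                "Tags": [{"Key": "role", "Value": "load-generator"}],
            }],
        )
        return [i.id for i in instances]

    def terminate_load_generators(instance_ids, region="us-east-1"):
        # Pay-as-you-go only stays cheap if the machines go away when the test ends.
        boto3.client("ec2", region_name=region).terminate_instances(InstanceIds=instance_ids)

    if __name__ == "__main__":
        ids = launch_load_generators(count=20)
        print(f"Launched {len(ids)} load generators: {ids}")
        # ... run the load test and collect results here ...
        terminate_load_generators(ids)

The point of the sketch is the lifecycle, not the particular API calls: the load-generating capacity appears minutes before the test and disappears minutes after it, which is exactly what you cannot do with racks of physical machines.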

The second key way the cloud helps us achieve more realistic load tests is its ability to allow testing from disparate locations. In many cases your real-world users are not sitting inside your firewall; they are accessing your application from their offices and their homes - from locations around the country or around the globe. For performance testing to be accurate, it must include load generated from these locations as well. If your testing uses only load-generating machines inside your firewall, you're not testing the entire delivery chain. With the cloud, you can execute load tests that access your web application just as your users do - from outside your firewall - and validate every component of the delivery chain, including the firewall, DNS, network equipment, and ISP. These tests are not only more realistic, but they also allow you to understand the impact of third-party components such as content delivery networks, analytics servers, and ad servers. And you know how I feel about realistic tests.
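
To picture what those outside-the-firewall tests are measuring, here is a deliberately tiny sketch of the kind of agent a cloud load generator might run: a handful of concurrent virtual users fetching a page and timing the full round trip. The URL, user count, and iteration count are placeholder assumptions, and a real load testing tool layers realistic scripting, ramp-up, and think times on top of this. Run from generators in several regions, the same script captures what users in each geography actually experience.

    import time
    from concurrent.futures import ThreadPoolExecutor

    import requests

    TARGET_URL = "https://www.example.com/"  # hypothetical application under test
    VIRTUAL_USERS = 10
    ITERATIONS_PER_USER = 5

    def virtual_user(user_id):
        # Each virtual user repeatedly requests the page and records the full
        # round-trip time as seen from this generator's location.
        timings = []
        for _ in range(ITERATIONS_PER_USER):
            start = time.perf_counter()
            requests.get(TARGET_URL, timeout=30)
            timings.append(time.perf_counter() - start)
        return user_id, timings

    if __name__ == "__main__":
        with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
            for user_id, timings in pool.map(virtual_user, range(VIRTUAL_USERS)):
                avg = sum(timings) / len(timings)
                print(f"user {user_id}: avg {avg:.2f}s over {len(timings)} requests")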

Now that I've discussed the advantages the cloud can bring to your load testing, I'd like to talk about some of the limitations of relying solely on a cloud solution. Like any other part of our testing effort, understanding the tools we have at our disposal and how to use them is the key to being successful. Just because we have some great tools in our toolbox doesn't mean they will help us everywhere, especially if they aren't used correctly. The cloud is no exception.

First of all, load testing from the cloud should not replace internal testing - that is, testing with load generated from within the firewall. Too often, I've seen organizations complete their load test scripting and then move straight to running large-scale tests from the cloud. To me, this approach misses a key step: load testing cycles should include a combination of tests run from both inside and outside the firewall. You may be wondering if I'm contradicting myself - based on my earlier discussion, the cloud seems like the way to go for all of our load testing, doesn't it? Why would I want to do anything else? There are a couple of reasons why "inside the firewall" testing should still be considered.

First, as we've already seen, driving the test load from cloud-based load generators lets us test the whole delivery chain, so our performance measurements include not only the application itself but also all the networking pieces that sit between the application and the end user. Running a load test from inside the firewall, by contrast, gives us performance measurements of essentially just the application. Combining the results of these two kinds of tests can yield some very powerful analysis: if we correlate the same measures gathered from "inside the firewall" tests with those gathered from tests run from the cloud, we get a much better sense of the impact the delivery chain itself has on measured performance. Without these combined tests, it can be very difficult, if not impossible, to understand the true root cause of performance issues - whether they stem from things under our control (inside the firewall) or from things we may not control (outside the firewall).
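
Here is a small, hedged sketch of what that correlation can look like in practice: the same transactions measured from inside the firewall and from cloud-based generators, with the difference attributed to the delivery chain. The transaction names and timings are made-up example data, not benchmarks.

    # Made-up example data, in seconds, for the same three transactions.
    inside_firewall = {"login": 0.42, "search": 0.95, "checkout": 1.30}  # application (+ LAN) only
    from_cloud      = {"login": 0.81, "search": 1.60, "checkout": 2.75}  # full delivery chain

    for transaction, app_time in inside_firewall.items():
        total_time = from_cloud[transaction]
        chain_overhead = total_time - app_time
        share = chain_overhead / total_time * 100
        print(f"{transaction:>8}: app {app_time:.2f}s, delivery chain {chain_overhead:.2f}s "
              f"({share:.0f}% of what the end user experiences)")

If the cloud-side numbers degrade under load while the inside-the-firewall numbers stay flat, the bottleneck is likely somewhere in the delivery chain rather than in the application - and vice versa.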

As you can see, the cloud is opening new opportunities to improve the scale and realism of load testing while saving time and lowering costs. But to realize these benefits, a proper approach is required, and part of that approach is selecting the right cloud load testing solution - it's not just about moving to the cloud itself. In my next article, I will discuss how best to approach the selection of a cloud testing solution.

More Stories By Steve Weisfeldt

Steve Weisfeldt is a Senior Performance Engineer at Neotys, a provider of load testing software for Web applications. Previously, he worked as President of Engine 1 Consulting, a services firm specializing in all facets of test automation. Before that, he was a Senior Systems Engineer at Aternity, and prior to that he spent seven years at automated testing vendor Segue Software (acquired by Borland). While spending most of his time at Segue delivering professional services and training, he was also involved in pre-sales and product marketing efforts.

Having worked in the load and performance testing space since 1999, Steve has been involved in load and performance testing projects of all sizes, in industries spanning the retail, financial services, insurance, and manufacturing sectors. His expertise lies in enabling organizations to optimize their ability to develop, test, and launch high-quality applications efficiently, on time and on budget. Steve graduated from the University of Massachusetts-Lowell with a BS in Electrical Engineering and an MS in Computer Engineering.
