Best Practices for Load Testing Mobile Applications | Part I

The differences between testing traditional web and mobile applications

Mobile applications and mobile websites have become a major channel for conducting business, improving employee efficiency, communicating, and reaching consumers. In the past, mobile played a smaller role in business applications, so performance issues and outages were less of a concern. This is no longer the case. Today, performance problems with mobile applications lead directly to revenue loss, brand damage, and diminished employee productivity. Part I of this article discusses the differences between testing traditional web and mobile applications, specific challenges associated with mobile load testing, mobile testing basics and best practices for recording mobile load test scenarios. In Part II, we will look at how to conduct realistic tests and how to best analyze the results.

Application developers have long understood the need for load testing conventional desktop web applications to ensure that they will behave properly under load with the expected number of users. With the advent of mobile apps and mobile websites the principles of load testing have not changed. There are, however, challenges specific to mobile load testing that must be addressed by your load testing solution.

Since mobile apps and applications for desktop web browsers use the same underlying technologies, the good news is that most load testing tasks and challenges are the same. This means that you don't necessarily need a brand new, mobile-specific load testing tool, but you do need a quality web load testing tool capable of handling the nuances of load testing mobile apps. Using a tool that enables testing of traditional and mobile web applications enables you to leverage existing in-house skills for designing and parameterizing your scripts, running your tests, and analyzing the results.

Despite the similarities between traditional and mobile load testing, there are three key differences:

  • Simulating network and bandwidth for wireless protocols: With 3G wireless protocols, mobile devices typically connect to the Internet using a slower, lower-quality connection than desktops and laptops. This affects response times on the client side and on the server, which you'll need to account for as you define your tests and analyze your results. Additionally, latency and packet loss become more of a factor with mobile applications and need to be considered (see the sketch after this list).
  • Recording on mobile devices: Obviously, mobile apps run on mobile devices, and this can make it difficult to record test scenarios, particularly for secured applications that use HTTPS.
  • Supporting a wide range of devices: The many different kinds of mobile devices on the market have led web application designers to tailor content based on the capabilities of the client's platform. This presents challenges for recording and playing back test scenarios.
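The effect of a slower wireless connection can be approximated at the protocol level by pacing each virtual user's traffic. The following Python sketch, using only the standard library, illustrates the idea; the bandwidth and latency figures and the URL are placeholder assumptions, not values taken from any particular tool.

```python
import time
import urllib.request

# Hypothetical example values; real 3G profiles vary widely.
BANDWIDTH_BPS = 384_000 / 8      # ~384 kbit/s downlink, expressed in bytes/second
LATENCY_S = 0.2                  # ~200 ms of added latency per request
CHUNK = 4096                     # read the response in small chunks so pacing is smooth

def fetch_throttled(url):
    """Fetch a URL while emulating a slow mobile connection.

    The sleep before the request models network latency; reading the
    body in paced chunks caps the effective download bandwidth.
    """
    time.sleep(LATENCY_S)                      # simulated network latency
    start = time.time()
    with urllib.request.urlopen(url, timeout=30) as resp:
        body = b""
        while True:
            chunk = resp.read(CHUNK)
            if not chunk:
                break
            body += chunk
            time.sleep(len(chunk) / BANDWIDTH_BPS)   # pace the download
    elapsed = time.time() - start
    return resp.status, elapsed, len(body)

if __name__ == "__main__":
    status, seconds, size = fetch_throttled("http://m.example.com/")
    print(f"HTTP {status}: {size} bytes in {seconds:.2f}s over a simulated 3G link")
```

Commercial load testing tools apply the same principle with much finer-grained bandwidth, latency, and packet-loss profiles per virtual user.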

Mobile Load Testing Basics
As you may know, a typical automated functional test for a mobile application emulates user actions (including tap, swipe, zoom, and entering text) on a real device or an emulator. The objective of load testing, however, is not to test the functionality of the application for just a single user. Rather, the goal is to see how the server infrastructure performs when handling requests from a large number of users, and to understand how response times are affected by other users interacting with the application.

An effective load test simulates a high volume of simultaneous users accessing your server via your application. Using real devices or emulators for this task is impractical because it demands acquiring, configuring, and synchronizing hundreds or thousands of real devices or machines running emulators.

The solution, of course, is to use a load testing approach that is designed to scale as needed. With a client-based approach, user actions in the browser or the native application are recorded and played back. In contrast, a protocol-based approach involves recording and reproducing the network traffic between the device and the server. To verify performance under large loads, tools that enable protocol-based testing are superior to those that support only client-based testing because they can scale up to millions of users while checking errors and response times for each user.

The basic process for protocol-based mobile load testing is:

  1. Record the network traffic between the device and the server
  2. Replay the network requests for a large number of virtual users
  3. Analyze the results
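As a rough illustration of steps 1 and 2, the Python sketch below replays a handful of previously recorded requests for many concurrent virtual users and records a status and response time for each request. The recorded URLs and the user count are hypothetical; a real load testing tool would also handle cookies, dynamic parameters, think times, and far larger user populations.

```python
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

# Hypothetical recorded scenario: the URLs below stand in for requests
# captured from a real device during recording.
RECORDED_REQUESTS = [
    "http://m.example.com/login",
    "http://m.example.com/catalog",
    "http://m.example.com/checkout",
]

VIRTUAL_USERS = 100  # scale this up to produce the desired load

def virtual_user(user_id):
    """Replay the recorded requests once and report per-request timings."""
    timings = []
    for url in RECORDED_REQUESTS:
        start = time.time()
        try:
            with urllib.request.urlopen(url, timeout=30) as resp:
                resp.read()
                status = resp.status
        except Exception as exc:           # count any failure as an error
            status = f"error: {exc}"
        timings.append((url, status, time.time() - start))
    return user_id, timings

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
        for user_id, timings in pool.map(virtual_user, range(VIRTUAL_USERS)):
            for url, status, elapsed in timings:
                print(f"user {user_id:3d} {status} {elapsed:.3f}s {url}")
```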

It appears straightforward, but there are challenges at every step. The good news is that these challenges can be addressed with an effective load testing approach.

Recording Mobile Load Testing Scenarios
To generate a mobile test scenario, you first need to identify the type of mobile application under test. Challenges associated with capturing the data exchanges between a mobile application and the server depend on the design of the application:

  • Native apps - These apps are coded using a programming language (Objective-C for iOS, Java for Android) and API that is specific to the device. As such, they are tied to a mobile platform and are installed from an online store or market.
  • Web apps - Built with web technologies (such as HTML and JavaScript), these applications can be accessed from any mobile browser. More sophisticated web apps may use advanced features like geolocation or web storage for data or include customizations to better match the browser used. Two popular web apps are http://touch.linkedin.com and http://m.gmail.com.
  • Hybrid apps - A web app embedded in a native app is known as a hybrid app. The native part of the application is limited to a few user interface elements, such as the menu or navigation buttons, and functions such as automatic login. The main content is displayed in an embedded web browser component. The Facebook application, installed from an online store or market, is a typical example.

Recording Tests for Native Apps
Because native apps run on your device or within an emulator, to record a test you need to intercept the network traffic coming from the real device or the emulator.

To intercept this traffic, the equipment that records the traffic must be connected to the same network as the device. When the recording computer is on the intranet behind a firewall, it is not possible to record a mobile device connected via a 3G or 4G wireless network. The device and the computer running the recorder must be connected to the same Wi-Fi network.

Most load testing tools provide a proxy-based recorder, which is the easiest way to record an application's network traffic. To use this approach, you need to configure the mobile device's Wi-Fi settings so that the traffic goes through the recording proxy. Some mobile operating systems, such as iOS and Android 4, support making this change, but older versions of Android may not. Moreover, some applications connect directly to the server regardless of the proxy settings of the operating system. In either of these cases, you need a tool that offers an alternative to proxy-based recording, such as a method based on network capture or tunneling.
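As one illustration of proxy-based recording, the sketch below assumes the open-source mitmproxy tool is used as the recording proxy and writes each intercepted request to a scenario file; the output file name and format are invented for the example, and commercial recorders produce richer, replayable scripts.

```python
# record_scenario.py -- run with:  mitmdump -s record_scenario.py
# Point the device's Wi-Fi proxy settings at the machine running mitmdump,
# then exercise the app; each intercepted request is appended to the file below.
from mitmproxy import http

SCENARIO_FILE = "recorded_scenario.txt"   # hypothetical output format

def request(flow: http.HTTPFlow) -> None:
    """mitmproxy addon hook, called once per intercepted request."""
    with open(SCENARIO_FILE, "a") as out:
        out.write(f"{flow.request.method} {flow.request.pretty_url}\n")
        for name, value in flow.request.headers.items():
            out.write(f"  {name}: {value}\n")
```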

Note: You can use the following simple test to check whether the application can be recorded using a proxy. First, configure the proxy settings on the device and record your interactions with any website in a mobile browser. Then, try to record interactions in the native application. If your testing tool successfully records the browser-generated traffic but does not record traffic generated by the native application, then you can conclude that the native application is bypassing the proxy settings and that an alternative recording method is required.

Recording Tests for Web Apps and Mobile Version of Websites
Web apps use the same web technologies as modern desktop browsers. As a result, you can record the application or the mobile version of a website from a modern browser on your regular desktop computer, which is an easier and faster alternative to recording from the device.

Many web applications check the browser and platform used to access them. This enables the application, when accessed from a mobile device, to redirect to a mobile version of the content that may contain less text or fewer images. To test such an app from the desktop, you need to modify the requests to make them appear to the server to be coming from a mobile device. Otherwise, you will not be testing the mobile version of the application as the server may redirect to a desktop version. Some browser plugins provide the ability to alter the identity of the browser (by modifying the User-Agent header of requests). Support for this feature is also directly integrated in the recorder of advanced load testing tools.
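For example, the short Python sketch below sends a request with a mobile User-Agent header so that a server that sniffs the client platform should serve (or redirect to) its mobile content; the User-Agent string and URL are placeholders chosen for illustration.

```python
import urllib.request

# A sample iPhone User-Agent string; any current mobile UA will do.
MOBILE_UA = ("Mozilla/5.0 (iPhone; CPU iPhone OS 6_0 like Mac OS X) "
             "AppleWebKit/536.26 (KHTML, like Gecko) Version/6.0 Mobile/10A403 Safari/8536.25")

req = urllib.request.Request("http://www.example.com/",
                             headers={"User-Agent": MOBILE_UA})
with urllib.request.urlopen(req, timeout=30) as resp:
    # If the site sniffs the User-Agent, this should return the mobile
    # version of the page rather than the desktop one.
    print(resp.status, resp.geturl())
```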

Modifying the browser's identity is not always enough. You obviously cannot use this approach to transform Internet Explorer 6 into an HTML5-compatible browser. The browser you use on the desktop must be able to parse and render content created for mobile browsers, so it's best to record with a modern browser like Internet Explorer 9, Firefox 5, Chrome 15, or Safari 5 (or a more recent version of any of these if available). If the application includes WebKit-specific features, you should use a WebKit-based desktop browser, preferably either Chrome or Safari.

Recording Tests for Hybrid Apps
Obviously, tests for native apps cannot be recorded using a desktop browser. However, tests for many hybrid apps can. You may be able to directly access the URL used for the application, for example http://m.facebook.com for the Facebook application, and record your tests as you would for a classic web app.

Recording Tests for Secured Native Applications
There are additional challenges to consider when recording tests for a secured native application, that is, an application that uses HTTPS for the login procedure or any other processing.

By default, all HTTPS recording methods, whether proxy- or tunnel-based, are seen as man-in-the-middle attacks by the device. This raises a non-blocking alert in a desktop or mobile browser, but it leads to an outright connection refusal in native applications, making it impossible to record the secured traffic.

The only way to record tests for secured native applications is to provide a root certificate that authorizes the connection with the proxy or tunnel. While this feature is currently supported by relatively few load testing solutions, it is essential for load testing any native application that relies on HTTPS.

Note: The root certificate must be installed on the device. This operation is simple for iOS devices; you can simply send the certificate via email and open the attachment on the device. For other platforms, including Android, the procedure is not as straightforward and may depend on the version of the operating system and the manufacturer of the device.

Running Realistic Tests
Once you've recorded a test scenario, you need to parameterize it so that it can emulate users with different identities and behaviors as it is played back, producing a realistic load on the server (a minimal sketch follows below). This step is required for both traditional and mobile web applications, and the tools used to complete it are the same. When playing back the test scenarios, however, there are several challenges specific to mobile load testing; we will discuss these in Part II of this article on "Best Practices for Load Testing Mobile Applications."
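As a minimal illustration of parameterization, the sketch below reads test accounts from a hypothetical users.csv file and substitutes each virtual user's own credentials into the recorded login request; the URL, form field names, and file format are assumptions made for the example.

```python
import csv
import urllib.parse
import urllib.request

# users.csv is a hypothetical data file with one test account per line:
#   username,password
#   user001,secret1
#   user002,secret2

def load_accounts(path="users.csv"):
    with open(path, newline="") as f:
        return list(csv.DictReader(f))

def login(account):
    """Replay the recorded login request with this virtual user's own credentials."""
    data = urllib.parse.urlencode({
        "username": account["username"],   # parameterized instead of the recorded value
        "password": account["password"],
    }).encode()
    req = urllib.request.Request("http://m.example.com/login", data=data)
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.status

if __name__ == "__main__":
    for account in load_accounts():
        print(account["username"], "->", login(account))
```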

More Stories By Steve Weisfeldt

Steve Weisfeldt is a Senior Performance Engineer at Neotys, a provider of load testing software for Web applications. Previously, he worked as the President of Engine 1 Consulting, a services firm specializing in all facets of test automation. Prior to his involvement at Engine 1 Consulting, he was a Senior Systems Engineer at Aternity. Prior to that, Steve spent seven years at automated testing vendor Segue Software (acquired by Borland). While spending most of his time at Segue delivering professional services and training, he was also involved in pre-sales and product marketing efforts.

Having worked in the load and performance testing space since 1999, Steve has been involved in load and performance testing projects of all sizes, in industries spanning the retail, financial services, insurance, and manufacturing sectors. His expertise lies in enabling organizations to optimize their ability to develop, test, and launch high-quality applications efficiently, on time, and on budget. Steve graduated from the University of Massachusetts-Lowell with a BS in Electrical Engineering and an MS in Computer Engineering.
