Performance Test Automation for GWT and SmartGWT

The next “evolutionary” step is to monitor performance for every end user

This article is based on the experience of Jan Swaelens, Software Architect at Sofico, where he is responsible for automated performance testing of the company's new web platform based on GWT and SmartGWT. Sofico specializes in software solutions for automotive finance, leasing, fleet and mobility management companies.

Choosing GWT and SmartGWT over Other Technologies
About two years ago Sofico started a project to replace its rich desktop application (built with PowerBuilder) with a browser-based rich Internet application. The developers selected GWT and SmartGWT as core technologies to leverage their in-house Java expertise, and because they believed in the potential of what these (fairly) new technologies had to offer. Their goal was to replace the existing desktop client with a new one that ran in a browser. Their eyes were set on a better user experience and a high degree of customizability, to give their customers the flexibility and adaptability they need to run their businesses.

Need End-to-End Visibility into GWT Black Box
GWT proved a great choice, as they could soon deliver a first basic version. The problems started when they tried to figure out what was actually going on inside these frameworks in order to analyze performance problems reported by the first testers.

The developers started off with the "usual suspects" - the browser-specific dev tools for Chrome, Firefox and IE. Back then, the built-in tools lacked first-class JavaScript performance analysis capabilities, which made it difficult to analyze a complex browser application. There were also no integration capabilities with server-side performance analysis tools such as JProfiler that would have allowed them to analyze the impact of, and correlation between, server-side and client-side GWT code. Taking performance seriously, the performance automation team came up with some key requirements for additional tooling and process support.

Requirement #1: Browser to Database Visibility to "understand" what's going on
Do you know what really happens when a page of a GWT application is loaded? No?! Neither did the developers at Sofico. Getting insight into the "Black Box" was therefore the first requirement, because they wanted to understand: what really happens in the browser, how many resources are downloaded from the web server, which transactions make it to the app server, which requests are cached and where, and how the business logic and data access layer implementations impact the end-user experience.

The following screenshots show the current implementation using dynaTrace (sign up for the free trial), which gives the developers full visibility from the browser to the web, app and database servers. The Transaction Flow view visualizes how individual requests, page loads and service calls are processed by the different application tiers.

End-to-End Visibility gave the developers more insight into how their GWT Application really works and what happens when pages are loaded or users interact with certain features.
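To make the idea concrete, here is a minimal plain-Java sketch of the kind of per-tier breakdown such an end-to-end trace surfaces. This is purely illustrative - it is not dynaTrace's API, and the tier names and timings are invented:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch only (not dynaTrace's API): recording how long a
// single user action spends in each application tier, similar to what a
// browser-to-database trace would surface.
public class TierTimings {
    private final Map<String, Long> millisPerTier = new LinkedHashMap<>();

    // Adds time spent in a tier; repeated calls for the same tier accumulate.
    public void record(String tier, long millis) {
        millisPerTier.merge(tier, millis, Long::sum);
    }

    // Total time of the user action across all tiers.
    public long total() {
        return millisPerTier.values().stream().mapToLong(Long::longValue).sum();
    }

    // Transaction-flow-style summary, one line per tier.
    public String summary() {
        StringBuilder sb = new StringBuilder();
        millisPerTier.forEach((tier, ms) ->
            sb.append(String.format("%-10s %5d ms%n", tier, ms)));
        return sb.toString();
    }

    public static void main(String[] args) {
        TierTimings t = new TierTimings();
        t.record("browser", 120);  // JS execution + rendering (invented)
        t.record("web", 15);       // static resource serving (invented)
        t.record("app", 230);      // GWT-RPC service + business logic (invented)
        t.record("db", 340);       // SQL statements (invented)
        System.out.print(t.summary());
        System.out.println("total: " + t.total() + " ms");
    }
}
```

The point of such a breakdown is exactly the question above: when a page feels slow, is the time going into the browser, the network, the app server, or the database?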

A great view for front-end developers is the timeline view, which shows what happens in the browser when a page gets loaded, when a user clicks a button that executes AJAX requests, or when background JavaScript continuously updates the page. It gives insight into performance problems in JavaScript code and inefficient use of resources (JS, CSS, images...), and highlights whether certain requests simply take a long time in the server-side implementation:

Developers love the timeline view: it is easy to see what work the browser does and where the performance hotspots are, and it even provides screenshots at certain events.
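The click-to-callback timing the timeline view attributes to a user action can be sketched in plain Java. This is a stand-in, not GWT's AsyncCallback API - the request and its 50 ms delay are simulated:

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.TimeUnit;

// Plain-Java sketch (not GWT's AsyncCallback API): attributing elapsed time
// to a user action, from the click that fires a request to its callback,
// the way a timeline view does.
public class AsyncTiming {
    // Stands in for an AJAX/GWT-RPC round trip; the 50 ms delay simulates
    // server-side processing time.
    static CompletableFuture<String> simulatedAjaxRequest() {
        return CompletableFuture.supplyAsync(() -> {
            try {
                TimeUnit.MILLISECONDS.sleep(50);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            return "response";
        });
    }

    // Measures the full click-to-callback duration in milliseconds.
    static long timedRequestMs() {
        long start = System.nanoTime();
        simulatedAjaxRequest().join();
        return TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - start);
    }

    public static void main(String[] args) {
        System.out.println("request completed in " + timedRequestMs() + " ms");
    }
}
```

If the measured duration is dominated by the simulated server delay, the hotspot is server-side; in the real timeline view the same reasoning separates JavaScript time, rendering time and server time.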

To read more about additional requirements, please click here for the full article.

Requirement #2: JavaScript Performance Data to Optimize Framework Usage

Requirement #3: Correlated Server-Side Performance Data

Requirement #4: Automation, Automation, Automation

Next Step: Real User Monitoring
Giving developers the tools they need to build optimized, fast websites is great. Having a test framework that automatically verifies that performance metrics are always met is even better. Ultimately you also want to monitor the performance of your real end users. The next "evolutionary" step is therefore to monitor performance for every end user, across all geographical regions and all the browsers they use. The following shows a dashboard that provides a high-level analytics view of actual users. If there are problems from specific regions, browser types, or specific website features, you can drill down to the JavaScript error, long-running method, problematic SQL statement or thrown exception.

After test automation comes production: You want to make sure to also monitor your real users and catch problems not found in testing
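At its core, a real-user monitoring backend aggregates measurement beacons from browsers and groups them by segment so slow regions or browsers stand out. A hypothetical sketch (not a real monitoring API; the regions, browsers and timings are invented; assumes a recent JDK for records):

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

// Hypothetical sketch of a RUM backend: aggregating per-user beacons by
// region so slow segments stand out, as a dashboard would show them.
public class RumAggregator {
    // One measurement reported from a real user's browser.
    record Beacon(String region, String browser, long loadMs) {}

    // Average page-load time per geographical region.
    static Map<String, Double> averageByRegion(List<Beacon> beacons) {
        return beacons.stream().collect(Collectors.groupingBy(
            Beacon::region, Collectors.averagingLong(Beacon::loadMs)));
    }

    // Regions whose average load time exceeds the threshold - the candidates
    // for drilling down into errors, methods and SQL statements.
    static List<String> slowSegments(Map<String, Double> averages, double thresholdMs) {
        return averages.entrySet().stream()
            .filter(e -> e.getValue() > thresholdMs)
            .map(Map.Entry::getKey)
            .sorted()
            .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<Beacon> beacons = List.of(          // invented sample data
            new Beacon("EU", "Chrome", 800),
            new Beacon("EU", "Firefox", 900),
            new Beacon("US", "IE", 2600),
            new Beacon("US", "Chrome", 2400));
        Map<String, Double> avg = averageByRegion(beacons);
        System.out.println(avg);
        System.out.println("slow (> 2000 ms): " + slowSegments(avg, 2000));
    }
}
```

A production RUM system adds many dimensions (browser version, feature, geography) and drill-down links to the underlying traces, but the grouping-and-thresholding shape is the same.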

Read more and test it yourself

If you want to analyze your web site - whether it is implemented in GWT or any other Java, .NET or PHP framework - sign up for the dynaTrace Free Trial (click on "try dynaTrace for free") and get 15 days of full-featured access to the product.

Also, here are some additional blogs you might be interested in.

If you happen to be a Compuware APM/dynaTrace customer, also check out the Test Automation features of dynaTrace on our APM Community Portal: Test Automation Video.

More Stories By Andreas Grabner

Andreas Grabner has been helping companies improve their application performance for 15+ years. He is a regular contributor within Web Performance and DevOps communities and a prolific speaker at user groups and conferences around the world. Reach him at @grabnerandi


