How Developers and Testers Can Work Together By @JustinRohrman | @DevOpsSummit #DevOps

How Developers and Testers Can Work Together to Overcome the Challenge of Object Recognition
By Justin Rohrman

Webpages are fickle.

Right when I think I know where everything is, a label changes, or a button moves a couple of pixels or to somewhere completely different on the page. For the last year or so, I have been working full time on a project building automated checks for a user interface.

There are a lot of challenges in UI automation - timing, selecting the right tests, figuring out how to make tests powerful - but handling page objects is always the first one I have to figure out.

Here are a few object recognition challenges I have come across and how I got past them.

The Elusive Object
I have come across a lot of projects trying to script away some testing problems in the user interface, but not a lot of success stories. There are a couple of patterns I notice in these hard lessons. The first one is having one person on the test team spend a week building a proof of concept.

Usually that person will spend a couple days getting their head around the tool, and then a couple more building or recording a few checks to demonstrate. When I did this, the demo was a nightmare. These one-off scripts would fail on and off, and in different ways each time. Management understandably lost interest and we took another route.

The other pattern throws caution out and charges forward. The first automation project I worked on was like this. We built a proof of concept that was small and simple enough that it misled the team into thinking things would be simple. Over the next six months we built some shaky infrastructure and a little over a hundred checks. Unfortunately, we rarely had more than half of those passing during the nightly runs. We ended up with an expensive source of information that couldn't be trusted.

Usually when these failed, the script just plain couldn't find the screen element it was supposed to be touching. We would kick off a script, it would navigate to the next page, start the search before the page had finished loading, and then, boom: red light.

This can happen for a few reasons.
Probably the most basic problem you'll ever have is dealing with pixels. The first tool I ever used to figure out this UI automation thing was for automating basic IT jobs, like getting a nightly server backup running. When this tool performed a click, or typed into a text field, it found those fields by their pixel location on the screen. Any time something on that screen changed - a button changing size, moving to another place on the same screen, or even the browser getting resized - the script would break. If this is how your tool finds objects, you should probably get a different tool.
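The coordinate fragility is easy to demonstrate. The sketch below is a toy Python model, not any real tool's API: element names and layout numbers are invented, the "tool" replays a click at a recorded (x, y) point, and we resolve which element that point actually hits.

```python
def element_at(layout, x, y):
    """Return the name of the element whose bounding box contains (x, y)."""
    for name, (left, top, width, height) in layout.items():
        if left <= x < left + width and top <= y < top + height:
            return name
    return None

# Original layout: the recorded script clicks (120, 210) and hits the Save button.
layout_v1 = {"save_button": (100, 200, 80, 30), "delete_button": (100, 240, 80, 30)}
print(element_at(layout_v1, 120, 210))  # save_button

# After a small redesign every control shifts up 30 pixels...
layout_v2 = {"save_button": (100, 170, 80, 30), "delete_button": (100, 210, 80, 30)}
# ...and the very same recorded click now lands on Delete.
print(element_at(layout_v2, 120, 210))  # delete_button
```

Nothing in the script changed, yet it now presses the wrong button - which is exactly why pixel-based recording breaks on every layout tweak.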

The next journey was with a very basic framework; I think it was one of the first UI libraries built for Ruby. This tool used XPath to find everything on the page. XPath is a little bit like specifying a file path.

The first drawback was how slow the process of building checks was. Each time I wanted to touch a new button or field, I had to stop and open up developer tools to find the path. It was a little tedious. The locators were fragile, too. If we added a second table above the first, the locator would break. If the rows in the table were sorted and the default sort changed, the locator would report that the text in row 2, column 2 was now "incorrect." If the rows were orders, sorted newest first, and someone logged in as that user and created another order, or deleted the first, the automated tests would again report a failure.
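The fragility of positional XPath can be sketched with Python's standard-library ElementTree; the table markup and ids here are invented for illustration. A locator that says "the first table on the page" breaks the moment another table is inserted above it, while a locator anchored to a stable attribute survives:

```python
import xml.etree.ElementTree as ET

# Page v1: a single table of orders.
page_v1 = ET.fromstring(
    "<body><table id='orders'><tr><td>A-100</td></tr></table></body>"
)
# Page v2: a summary table has been added above the orders table.
page_v2 = ET.fromstring(
    "<body><table id='summary'><tr><td>totals</td></tr></table>"
    "<table id='orders'><tr><td>A-100</td></tr></table></body>"
)

positional = "./table[1]/tr/td"         # "the first table on the page"
by_attribute = "./table[@id='orders']/tr/td"  # "the table whose id is 'orders'"

print(page_v1.find(positional).text)    # A-100
print(page_v2.find(positional).text)    # totals - the positional locator broke
print(page_v2.find(by_attribute).text)  # A-100 - the attribute locator survived
```

The positional path silently starts reading the wrong table, which is worse than an outright failure: the check keeps running against the wrong data.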

Tools have come a long way from those days.
An automated testing tool like TestComplete provides built-in functionality to improve the stability of automated tests. The TestComplete platform works at the object level. This means that when TestComplete captures user actions over an application, it records more than just mouse clicks and simple screen coordinate-based actions.

Searching for an object based on its ID in the DOM is one of the most commonly used methods right now. It isn't perfect, but it is much better than the alternatives. Usually when this fails, it's because there is a timing issue. For example, I work on a product that is very data dense. Every page has expandable sections with buttons and rows of data. When a test fails from object problems now, it's because the script is trying to interact with the page before everything has loaded and is ready.
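The usual remedy is an explicit wait: poll for the element instead of assuming it is already there. This is a generic sketch rather than any particular tool's API; the `slow_page_find` stub below stands in for a real DOM lookup that returns nothing while the page is still loading.

```python
import time

def wait_for_element(find, locator, timeout=10.0, interval=0.25):
    """Poll find(locator) until it returns an element or the timeout expires."""
    deadline = time.monotonic() + timeout
    while True:
        element = find(locator)
        if element is not None:
            return element
        if time.monotonic() >= deadline:
            raise TimeoutError(f"element {locator!r} did not appear within {timeout}s")
        time.sleep(interval)

# Simulate a slow page: the element only "exists" half a second after load starts.
start = time.monotonic()
def slow_page_find(locator):
    return {"id": locator} if time.monotonic() - start > 0.5 else None

button = wait_for_element(slow_page_find, "submit-button", timeout=5.0)
print(button["id"])  # submit-button
```

Most mature frameworks ship something equivalent (Selenium calls it `WebDriverWait`), but the shape of the idea is the same: bound the wait, retry the lookup, fail loudly only when the timeout expires.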

Sometimes the ID is generated based on an order number - with a different button for each order number. This can be very challenging, especially if you want the tool to create an order and then click the "right" radio button. A human might be able to understand what to click, but coding that in software can be challenging.
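Two ways to cope with generated IDs, sketched in Python with an invented `order-<number>-approve` naming scheme: build the exact ID from data the test itself just created, or match only the stable parts of the ID.

```python
import re

# Hypothetical IDs scraped from the page, one set of buttons per order.
page_ids = ["order-1041-approve", "order-1042-approve", "order-1042-cancel"]

# Strategy 1: the test created order 1042 a moment ago, so it can
# reconstruct the exact ID instead of guessing which button is "right".
def approve_button_id(order_number):
    return f"order-{order_number}-approve"

print(approve_button_id(1042))  # order-1042-approve

# Strategy 2: when the number isn't known, match the stable parts of the ID
# with a pattern and ignore the generated middle.
pattern = re.compile(r"^order-(\d+)-approve$")
matches = [i for i in page_ids if pattern.match(i)]
print(matches)  # ['order-1041-approve', 'order-1042-approve']
```

Strategy 1 is usually the safer one: the test already knows which order it made, so it never has to interpret the page the way a human would.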

Let's talk about how to code to make the job of test tooling easier, sometimes called testability requirements.

Develop for Testability
One of the stranger things I've seen in user interface automation isn't related to technology at all. When these projects start, managers decide that testers should build these systems all on their own. The marketing and product demos tend to support this idea; after all, why should test automation require the help of the people writing the production code? When the project starts moving along and the testers inevitably find a hitch, people get understandably frustrated.

The first thing we noticed was that we couldn't just lay the scripts on top of our product and have them run. Instead, the product itself needed to change, to add testability features. Originally, our software wasn't designed with any sort of UI automation in mind, so developers weren't very careful about adding IDs to the page elements. Each time I wanted to work on a new page, or even a new element on an old page, I had to put in a change request to get an ID added for that field. Hopefully I remembered all the fields on the first try. That created a one-day lag at minimum before I could get to work, longer if something important was going on at the time.

The other thing we did was a larger effort. The initial tests we wrote were full of hard-coded wait commands. Each button click or navigation was followed by a 5 (or more) second wait to give the page time to load before moving on to the next step in the script. Those waits made the scripts incredibly brittle; we never really knew how long a page would take to load. If 5 seconds wasn't long enough, the script would try to click a button or type into a field that wasn't there yet, and fail. To get around this problem, we had to design a polling solution that would tell us when a page was ready to be used. Not just design it, but also convince the development managers that it was a good idea and in their best interest. The end result was a flag in the DOM that would be set to isReady='true' when the page was fully loaded.
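That flag turns every fixed sleep into a bounded poll. A minimal sketch, with a stub standing in for reading the DOM attribute (in a real suite this would be a DOM query, something along the lines of reading an `isReady` attribute off a known element, not a Python function):

```python
import time

def wait_until_ready(get_flag, timeout=10.0, interval=0.2):
    """Replace a fixed sleep with a poll on the page's readiness flag."""
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        if get_flag() == "true":
            return
        time.sleep(interval)
    raise TimeoutError("page never set isReady='true'")

# Stub: the flag flips to 'true' 0.4 seconds after "navigation" starts.
loaded_at = time.monotonic() + 0.4
def read_flag():
    return "true" if time.monotonic() >= loaded_at else "false"

wait_until_ready(read_flag, timeout=5.0)
print("page ready")  # reached only once the flag flips to 'true'
```

Fast pages now cost a fraction of a second instead of a guaranteed 5 seconds, and slow pages get the full timeout before the script gives up.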

Successful UI automation projects take effort from everyone.
The product I am working on now probably has the most successful UI automation work I have seen. The product is mostly legacy; it isn't getting new JavaScript libraries thrown in once a week, and we don't have to worry about responsive design. It is built mostly on top of SQL stored procedures. Changes can be tricky because of that; sometimes a change in one place will break something in an area we had never considered. It's also very hard to unit test.

Every time a new feature is added or an old bit of functionality is changed somehow, the developers do that with the idea that I may need to add that into the suite of checks that runs over night. Every new element in the UI gets a static ID that hopefully doesn't change, and we try to minimize the number of fields that are added and removed dynamically. When it helps stability, the developers also work with me to write SQL scripts to seed data into the database.
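Seeding known data before a UI run can be sketched with Python's built-in sqlite3 and an in-memory database; the orders table and its columns are invented here, and a real suite would run similar statements against the product's own database:

```python
import sqlite3

# In-memory stand-in for the product database; schema is hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer TEXT, status TEXT)"
)
# Seed exactly the rows the UI checks will assume exist.
conn.executemany(
    "INSERT INTO orders (customer, status) VALUES (?, ?)",
    [("Acme", "open"), ("Globex", "shipped")],
)
conn.commit()

rows = conn.execute("SELECT customer, status FROM orders ORDER BY id").fetchall()
print(rows)  # [('Acme', 'open'), ('Globex', 'shipped')]
```

With the data pinned down up front, a check that reads "row 2 of the orders table" is asserting against something the suite controls, not whatever happened to be in the database that night.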

Getting a strategy in place to handle your user interface objects will save a lot of pain and frustration down the road. The usual objections follow the old story of ‘we are already short on developers, we don't have time for that'.

A better question might be, do you have the time a couple months from now to untangle the web you are making today? I bet the answer is no, we'd rather handle that now.
