
To automate or not to automate

Why are you still holding on to manual testing? Was a past automation failure too expensive? Or are you avoiding automation because you fear such a failure? The reasons for avoiding it are many, and many of them are legitimate. Some of the major ones include:

Complexity

Traditional, a.k.a. script-based, test automation is complex, all thanks to the need to ‘code’. Whether the tool is RFT, QTP, Selenium, or any other, a tester must learn the scripting language specific to it and apply development practices to turn a manually tested test case into a script. What if you could still automate without having your testing teams code scripts?
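To make the ‘code’ barrier concrete, here is a minimal sketch of what even a trivial scripted test case looks like. It is plain Python using the standard `unittest` module; `validate_login` is a hypothetical stand-in for the application under test, where a real Selenium or QTP script would instead drive browser navigation and element lookups.

```python
import unittest

# Hypothetical stand-in for the application under test; a real
# Selenium/QTP script would perform browser interactions here.
def validate_login(username: str, password: str) -> bool:
    """Accept only a non-empty username with a password of 8+ characters."""
    return bool(username) and len(password) >= 8

class LoginTest(unittest.TestCase):
    # Every step a manual tester performs by hand must be
    # re-expressed as code like this.
    def test_valid_credentials_accepted(self):
        self.assertTrue(validate_login("alice", "s3cretpass"))

    def test_short_password_rejected(self):
        self.assertFalse(validate_login("alice", "short"))

    def test_empty_username_rejected(self):
        self.assertFalse(validate_login("", "s3cretpass"))

# Run with: python -m unittest <this_file>
```

Even this toy case demands familiarity with Python syntax, `unittest` conventions, and assertion semantics — skills closer to development than to manual testing, which is exactly the complexity being described.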

Test automation is expensive

From investing in automation tools to hiring the right resources who know the tool's native language, traditional test automation requires a huge investment. The tool license-to-user ratio is 1:1, which means that adding even one more test automation expert to your team requires purchasing an additional license. What if you could share test automation tool licenses within your team?

Skill intensive

Yes, script-based test automation calls for programmers, and good programmers come at a price, often a higher one than expected. This increases the cost to automate. Don't forget to include the costs of hiring and retaining these resources. Scripting also isolates the testing exercise from the other team members (manual testers and subject matter experts). This is not intentional but a side effect of test automation. What if all levels of experience, domain knowledge, and technical knowledge could collaborate in test case creation?

Time intensive

Traditional test automation means scripting. Writing scripts, testing them, and making them work takes considerable time. And time means more cost, literally. What if you could shorten this time and still ensure automation?

Even though the aforementioned reasons are real, have you ever thought about what you are missing out on? Have you ever considered:

Incomplete test coverage

The pressure of time-to-market pushes your testing team to settle for less than 100% regression.

Time-to-market

Even if you decide to regress manually for 100% test coverage, imagine how much time you would save by automating regression. Imagine how much faster you could hit the market, and the first-mover advantage you would gain.

Human errors

Even though to err is human, the testing world can't use this as an excuse for a bug that gets noticed by the client. Manual regression can become mundane, and mundane work leads to bugs going unidentified.

Expensive

This encapsulates not only resource cost, tool cost, and infrastructure cost, but also the cost of increased time to market and of bugs left unidentified due to the lack of 100% regression, or sheer boredom. These are costs and lost opportunities you can't afford to ignore.

Manual testing is absolutely essential; test automation cannot replace it. If testing is a means to build high-quality software, then test automation is a means to that end. You would agree it is one of the most effective strategies for building an optimal software testing process. But the dilemma continues…

On one hand lies the cost of developing, maintaining, and operating test automation. On the other lies the cost of increased time to market and of bugs left unidentified due to the lack of 100% manual regression. What would you choose?

What if I could decrease the cost of automation while still offering all of its benefits? What if you could automate without having your testing teams code scripts? What if you could share test automation tool licenses within your team? What if all levels of experience, domain knowledge, and technical knowledge could collaborate in test case creation? What if you could shorten the time to automate and increase test coverage?

Would you still automate or not?

More Stories By Monica Paul

Monica Paul is a Digital Marketing Specialist at Qualitia Software - a leader in scriptless automation. She has 6+ years of experience and has worked with the world's leading products and services companies. She has been a speaker and judge at tech conferences. She loves talking to and guiding start-ups; right now she is sharing information on time-to-market for testing.
