
Moving Test Environments to the Cloud By @EFeatherston | @CloudExpo #Cloud

Is that a good first step?

How often do you get questions like "Have you gone to the cloud yet?" or "Why aren't we in the cloud?" or a myriad of others along those same lines? People still talk as if the cloud is a destination. I discussed this tendency last year in a blog, "The Cloud - Is It Your Actual Destination?" For all the hype surrounding cloud, the benefits are real. IDC forecasts global public IT cloud services spending to reach nearly $108B by 2017, and Gartner expects that by 2016 the bulk of IT spend will be for the cloud. Those are impressive numbers. The challenge is to remember that cloud is technology: a vehicle, a conduit, to help you provide value to the business.

The real challenge is determining what should move to the cloud. Do I go to public cloud or build a private cloud? Is hybrid the right choice? What benefits provide the best value to the business? Do I move everything? These are just some of the questions you should be asking yourself. There is no simple one-size-fits-all answer. No technology, including cloud, negates the need for good design and planning. How does one make sense of this? You need to put a process in place that identifies, weighs, and balances the business needs against the technical challenges. One potential area to look at as a first step is your test environments. I can already see some of your eyes rolling, but let's take a look at some of the potential benefits of doing that.

The challenge of QA and performance test environments
In-house QA and performance test environments have always presented challenges, and many times have been the forgotten stepchild of the development process. Every group wants:

  • Their own isolated, dedicated test environments
  • To run on independent build/test/deploy schedules
  • To be able to quickly configure/deploy/test changes both on cycle and for hot fixes
  • To performance test for scale, load, and capacity so they can understand breakpoints before going live

In an ideal world, with unlimited resources and money, this would be easy. Unfortunately, in the real world, setting up those kinds of dedicated test environments tends to be cost, space, and resource constrained. Hardware, space, and people must be shared and scheduled, and conflicts follow. The end result is shared resources, scaled down in size, that may or may not be a good representation of the production environment. Keeping large amounts of hardware around that is not utilized all the time is a hard sell, and not one that is usually won. Sharing these resources helps, but inevitably there are schedule conflicts, with one project impacting another during the testing cycles. Striking a balance is a pain point many of us are all too familiar with.

Stand up, deploy, test, breakdown, repeat
Public cloud providers can be a viable option to help address these challenges. Instead of the classic purchase, configure, and support cycle for your own hardware environments, you go to the cloud provider. Public cloud providers offer three key benefits that address the testing challenges:

  • You can obtain your own dedicated testing environment resources when you need them
  • You only pay for what you use
  • You can configure for dynamic resource usage, allowing for true performance testing for application scaling and identifying breakpoints

On-demand testing resources when you need them: stand up a test environment, deploy your app, test it, and break it back down when you are done. Dynamic scaling for performance testing. Sounds wonderful - what's the catch? No technology negates the need for good planning and design, and the cloud is no exception. To take advantage of the benefits the cloud can provide your testing, there are some items to consider and plan for in order to succeed.

  • Develop a set of setup, configuration, and deployment scripts for your cloud environment. All public cloud vendors provide a scripting mechanism for setup, deployment, and breakdown of environments. The added benefit of doing this is you have a standard, repeatable mechanism for standing up a consistent testing environment. Make sure to place these scripts under source control.
  • Develop a test data management strategy. If you have one already, revisit it to determine if there is any impact caused by moving into the public cloud space. (See my recent blog on test data management in the cloud.) As with the environment setup, develop and maintain scripts so that you have a repeatable process for standing up both the environment and the data.
  • Dynamic scaling is only helpful if your application is designed for horizontal scalability. Again, planning and design are key to being able to leverage the benefit.
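A repeatable test-data process can be as simple as a versioned seed script that rebuilds the schema and loads known fixtures on every run, so each freshly provisioned environment starts from identical data. A minimal Python sketch, using an in-memory SQLite database as a stand-in for whatever database your cloud test environment actually hosts (the table and fixture values are hypothetical):

```python
import sqlite3

# Fixture data would normally live in versioned files alongside the
# environment scripts; it is inlined here to keep the sketch self-contained.
CUSTOMER_FIXTURES = [
    (1, "Acme Corp", "active"),
    (2, "Globex", "active"),
    (3, "Initech", "suspended"),
]

def seed_test_data(conn):
    """Drop and rebuild test tables so every run starts from identical data."""
    conn.execute("DROP TABLE IF EXISTS customers")
    conn.execute(
        "CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, status TEXT)"
    )
    conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", CUSTOMER_FIXTURES)
    conn.commit()

# In-memory database stands in for the test environment's database server.
conn = sqlite3.connect(":memory:")
seed_test_data(conn)
seed_test_data(conn)  # safe to rerun: the reset makes the script idempotent

rows = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
assert rows == 3
```

Because the script is idempotent, it can run as part of the same pipeline that stands up the environment, and keeping it under source control versions the data alongside the infrastructure.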

Does it make sense for your organization and needs?
Using public cloud infrastructure-as-a-service offerings for test environments can definitely have benefits. It can also be an attractive first step, a toe in the water, for your organization's foray into the world of cloud. Is it for everyone? Not necessarily. You need to evaluate your readiness and ability to leverage the benefits. If you are facing the testing challenges discussed, this step could be for you. Success is not automatic. If you don't already have a well-disciplined DevOps process, this could be a good step toward developing one. As discussed, standardized scripts, source control, and process are all critical to leveraging the cloud for your test environments. Cloud technology will not make the magic and success happen on its own; it provides the opportunity and capability. With proper planning and design, test environments could very well be the right first step in your organization's successful leveraging of cloud technologies.

This post is brought to you by Cloud for Tomorrow.

More Stories By Ed Featherston

Ed Featherston is VP, Principal Architect at Cloud Technology Partners. He brings 35 years of technology experience in designing, building, and implementing large complex solutions. He has significant expertise in systems integration, Internet/intranet, and cloud technologies. He has delivered projects in various industries, including financial services, pharmacy, government and retail.


