Control Costs: Use Cloud Test Environments
By Dalibor Siroky

There is no more effective way to reduce your overall test environment spend than to migrate to cloud test environments and embrace testing and infrastructure automation. Test environments are inherently temporary: you set up an environment, run an automated test suite, and tear the environment down. If you can reduce the cycle time for this process to hours or minutes, you can cut your test environment budget considerably.

This is the “Holy Grail” for both cost savings and agility. This on-demand model allows you to take advantage of public cloud APIs and only pay for the time you need to run through automated tests. Its success depends on two things: automated infrastructure and automated acceptance tests. If you can reach these two goals, the cost savings from reduced infrastructure spend and the increase in agility will more than pay for the cost of both of these automation initiatives.

Automation is the Key to Cloud-Based Cost Savings
It isn’t enough to just move “to the Cloud.” If you really want to take advantage of cloud-based test environments, you need to change the way you view them. Everything about them should be automated, and you should be able to stand up and tear down an environment quickly. If you can spin up and spin down environments as needed, and if you can automate the setup of test data, then you can create a continuous deployment pipeline that treats the creation of a test environment as just another step in the build. The key is to get rid of manual testing so that test environments don’t need to exist any longer than it takes to execute a test suite.

Here’s an example. Assume you work on an application that runs on Tomcat. You track source code in Git, build the application with Jenkins, and publish build artifacts to a repository such as Nexus. Your continuous integration server runs 24/7: any time a developer pushes a commit, it kicks off a build and runs a comprehensive suite of unit tests. For years, your team was satisfied with continuous integration that ended at unit tests. When your project approached a release, a QA team would run through a series of manual checks against a static QA environment. That QA environment had to be maintained 365 days a year but was used for maybe 30 of them. It was inefficient, and QA was a constant source of configuration and management issues.

With the advent of continuous deployment, your team is now extending the build pipeline to create QA environments dynamically, only when needed. Your QA team is also committed to building out automated test suites that can run as part of your continuous deployment pipeline. In this new environment, no QA tester ever needs to run a test; the entire, end-to-end test suite can be automated. Instead of having to worry about keeping a separate environment up and running, your build pipeline creates it automatically, runs the test suite automatically, and then tears it down once testing is complete. It does all of this using public cloud APIs on a service such as Amazon Web Services, Microsoft Azure, or Rackspace.

Using cloud-based infrastructure APIs alongside deployment automation tools means that you only pay for the time you use. Your static QA environment might have cost something like $30,000 a year to maintain; your new, dynamic approach only runs for a few hours a week, keeping your effective annual cost under $5,000. Another benefit is that multiple test environments can be created at the same time without conflict. If the project is experiencing heavy development, your new pipeline can spin up a new, ephemeral test environment for multiple commits on multiple branches at the same time, for a fraction of the cost of a dedicated test environment.
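One simple way to get the conflict-free parallelism described above is to derive each environment’s name from the branch and commit that triggered it. The naming scheme below is a hypothetical convention, not a prescribed one:

```python
# Sketch: conflict-free parallel environments via unique names derived
# from branch and commit. The "qa-" prefix is an assumed convention.

def environment_name(branch, commit_sha):
    safe_branch = branch.replace("/", "-")
    return f"qa-{safe_branch}-{commit_sha[:7]}"

active = set()

def spin_up(branch, commit_sha):
    name = environment_name(branch, commit_sha)
    # Distinct (branch, commit) pairs yield distinct names,
    # so concurrent pipelines never collide.
    assert name not in active
    active.add(name)
    return name

# Two branches building at the same time get isolated environments:
a = spin_up("feature/login", "9f8e7d6c5b4a")
b = spin_up("main", "0a1b2c3d4e5f")
print(a, b)
```

Because every pipeline run owns a uniquely named environment, there is no shared state to schedule around, and no team ever waits for “the” QA environment to free up.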

Cloud Test Environments: Shift to OpEx, Save Money
The impact of this approach cannot be overstated. It is the future of test environments.
If you have a software project that requires a full QA regression test, and you create a script to automate both the execution of those tests and the creation and teardown of an isolated test environment, then you have changed almost everything about Test Environment Management. In this scenario, a code commit can trigger the dynamic creation of parallel test environments in the cloud, which may only need to exist for a handful of hours. In a pay-as-you-go environment on a public cloud, this means that your budgets are calculated not by how much it costs to rent an environment for a month or a year, but by how much it costs to pay for transient cloud-based resources during a single regression test.

It might cost you $1,000 per month to run a static QA environment. Compare that to the cost of running several environments for only a few hours at a time in a given month. During busy development weeks, you might spend $12 per test suite execution and run a regression test twice a day. During a slow week, you might not run any tests at all. Cloud-based infrastructure and automation allow you to reduce the ongoing costs associated with test environments dramatically.
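Putting the figures above into a back-of-the-envelope calculation makes the gap concrete. The 22 working days per month is an assumption for illustration; the other numbers come from the text.

```python
# Back-of-the-envelope comparison: a $1,000/month static environment
# vs. $12 per ephemeral test run, twice a day on busy days and
# not at all on slow ones.
STATIC_MONTHLY = 1000
COST_PER_RUN = 12
RUNS_PER_BUSY_DAY = 2

def ephemeral_monthly_cost(busy_days):
    return COST_PER_RUN * RUNS_PER_BUSY_DAY * busy_days

busy_month = ephemeral_monthly_cost(22)  # 528: even a fully busy month
slow_month = ephemeral_monthly_cost(0)   # 0: a quiet month costs nothing
print(busy_month, slow_month)
```

Even in the worst case (a regression suite running twice every working day), the ephemeral model stays at roughly half the static environment’s fixed cost, and idle months cost nothing at all.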

Get Automated with Plutora
When we help our clients make a transition to cloud-based test environments, the first thing we help them do is stand up Plutora so that they can track the effort required to support, create, and tear down test environments. Once you’ve installed Plutora and started to keep track of your environments, you’ll appreciate the real cost of keeping several test environments up and running 24/7 for projects that may only need them for a few hours a week.

You’ll also be able to see which environments are in the most demand, and which projects are the biggest drivers of both environment demand and environment conflict. Using Plutora as a Test Environment Management tool gives you the opportunity to identify the environments that would benefit the most from a migration to automated, cloud-based infrastructure.

The post Control Costs: Use Cloud Test Environments appeared first on Plutora.


More Stories By Plutora Blog

Plutora provides Enterprise Release and Test Environment Management SaaS solutions aligning process, technology, and information to solve release orchestration challenges for the enterprise.

Plutora’s SaaS solution enables organizations to model release management and test environment management activities as a bridge between agile project teams and an enterprise’s ITSM initiatives. Using Plutora, you can orchestrate parallel releases from several independent DevOps groups all while giving your executives as well as change management specialists insight into overall risk.

Supporting the largest releases for the largest organizations throughout North America, EMEA, and Asia Pacific, Plutora provides proof that large companies can adopt DevOps while managing the risks that come with wider adoption of self-service and agile software development in the enterprise. Aligning process, technology, and information to solve increasingly complex release orchestration challenges, this Gartner “Cool Vendor in IT DevOps” upgrades enterprise release management from spreadsheets, meetings, and email to an integrated dashboard, giving release managers insight and control over large software releases.
