Virtualization in Definition-Driven API Development
By Ryan Pinkham

Agile development is highly iterative and relies on rapid software delivery and early feedback. These practices are well understood and widely practiced in mature application development teams.

The shift to an agile software development process has helped teams accelerate time to market, improve quality, and reduce costs. With the technologies and processes now available, these same benefits can be delivered to API development teams.

API definition on Agile teams
API definition formats, like the OpenAPI Specification (formerly the Swagger Specification), give API developers the ability to write a language-agnostic definition for their REST APIs. An API definition can be thought of as a "contract" between the person or organization developing the API and the consumers that integrate with it. A properly defined API helps eliminate the guesswork consumers often face when calling an API.

Defining your API with a formal API specification supports a contract-first approach to API development, which focuses on designing the interface of your API and letting tooling generate documentation, code, and SDKs. In this approach, virtualization is key.
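
To make the idea of a contract concrete, here is a minimal sketch of what such a definition might describe, written as a Python dictionary (the parsed equivalent of a YAML or JSON spec). The /pets endpoint, its fields, and the example payload are hypothetical, not taken from the original post.

```python
# A minimal, hypothetical OpenAPI 3.0-style definition for a single endpoint,
# held as a Python dictionary (the parsed form of a YAML/JSON spec).
PET_API_SPEC = {
    "openapi": "3.0.0",
    "info": {"title": "Pet Store (example)", "version": "1.0.0"},
    "paths": {
        "/pets": {
            "get": {
                "summary": "List pets",
                "responses": {
                    "200": {
                        "description": "A JSON array of pets",
                        "content": {
                            "application/json": {
                                # Example payload consumers can rely on while
                                # the real implementation does not yet exist.
                                "example": [{"id": 1, "name": "Rex", "tag": "dog"}],
                            }
                        },
                    }
                },
            }
        }
    },
}
```

The example payload embedded in the definition is what makes the contract useful to both sides: consumers know exactly what shape to expect, and tooling can serve it back before the API is built.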

The importance of virtualization
Defining an API should be an iterative process: a developer makes tweaks based on how the API will actually behave when a client interacts with it. As a developer, the best way to truly understand how your API will behave from an end user's perspective is to create a fake version (or mock) of your API.

Incorporating a virtualization solution into your API development workflow enables you to preview how your API will behave in a given situation, solicit rapid feedback, and validate design decisions.
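
As a rough sketch of what virtualization can look like in practice, the following standard-library mock server replays the example responses declared in the hypothetical definition above, so consumers can call the API before any real implementation exists. It assumes the PET_API_SPEC dictionary from the previous sketch has been saved as pet_api_spec.py.

```python
# A sketch of API virtualization: serve the example responses declared in the
# definition so consumers can exercise the API before any real code exists.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

from pet_api_spec import PET_API_SPEC  # hypothetical spec from the previous sketch


class MockAPIHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Look up the requested path among the GET operations in the contract.
        operation = PET_API_SPEC["paths"].get(self.path, {}).get("get")
        if operation is None:
            self.send_error(404, "Path not defined in the API contract")
            return
        example = operation["responses"]["200"]["content"]["application/json"]["example"]
        body = json.dumps(example).encode("utf-8")
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Serve the mock on localhost:8080 until interrupted.
    HTTPServer(("localhost", 8080), MockAPIHandler).serve_forever()
```

A commercial virtualization tool adds far more (stateful behavior, latency simulation, error injection), but the principle is the same: the definition, not handwritten server code, drives the responses.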

What does this look like in action?
Combining API definition with API virtualization allows development teams to rapidly specify, prototype, and even test their projects, all before writing any implementation code. Putting this strategy into action requires having the right tools in place.
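
For instance, assuming the hypothetical mock above is running on localhost:8080, a consumer-style check like the following can validate the contract's response shape before a line of production code is written.

```python
# Sketch of validating the design against the mock: a consumer-style check that
# exercises the virtualized /pets endpoint defined in the earlier sketches.
import json
import urllib.request

with urllib.request.urlopen("http://localhost:8080/pets") as response:
    assert response.status == 200
    pets = json.loads(response.read().decode("utf-8"))

# The response shape matches the contract, so client code and tests can be
# written against it while the real API is still being designed.
assert isinstance(pets, list) and "name" in pets[0]
print("Mock API matches the contract:", pets)
```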


More Stories By SmartBear Blog

As the leader in software quality tools for the connected world, SmartBear supports more than two million software professionals and over 25,000 organizations in 90 countries that use its products to build and deliver the world’s greatest applications. With today’s applications deploying on mobile, Web, desktop, Internet of Things (IoT) or even embedded computing platforms, the connected nature of these applications through public and private APIs presents a unique set of challenges for developers, testers and operations teams. SmartBear's software quality tools assist with code review, functional and load testing, API readiness as well as performance monitoring of these modern applications.
