
Related Topics: @CloudExpo, Microservices Expo, IBM Cloud

@CloudExpo: Article

Reference Architecture for Cloud Computing

IBM releases second version of its cloud reference architecture

Admittedly, when I was heads-down in code earlier in my career, I did not pay much attention to reference architectures. We had our own internal architectures that served as 'the way and the truth,' and reference architectures for our product or solution domain were simply out of scope. Besides, reference architectures are, by design, not detailed enough to guide someone implementing any one of the hundreds of components that fall under them. So, for the most part, I ignored them, even though I could hear rumblings coming from rooms full of folks arguing over revision 25 of the reference architecture for some problem domain or another.

Fast forward a few years to a change of professional venue, and my outlook on reference architectures is a good deal different. If I were still developing, I'm sure my outlook would be much the same. However, talking with users on a frequent basis has made me aware that such architectures and solution domain overviews can be of great value to both buyers and providers. For buyers, reference architectures can help to orient them in a particular domain, and they can guide implementation and buying strategies. For providers, reference architectures serve to clearly communicate their outlook on a particular domain to both buyers and the broader market. Put simply, reference architectures serve both sides of the coin.

Now that's not to say that reference architectures come without their detractors. There are always those who stand ready to point out holes and biases in a particular provider's reference architecture. In fact, some seem to completely write off reference architectures as an instrument of marketing. In my opinion, some of these complaints are without merit and a bit overly cynical. Other complaints rise above typical inter-vendor sniping and actually point out valid holes, oversights, and biases in a particular provider's architecture. Open discourse and communication are good. In that light, I was glad to see IBM publish the second version of its cloud computing reference architecture to the Open Group earlier this week.

The document, which you can download here, explains the reference architecture in detail, but I want to look at the major highlights. To start, let's consider the high-level diagram for the architecture:

As you can see, the architecture orients itself around user roles for cloud computing. On either end, you have the cloud service creator and cloud service consumer. As its name implies, the cloud service creator role covers those who build services for the cloud, along with the creation tools they use. These tools include software development environments, virtual image development tools, process choreographing solutions, and anything else a developer may use to create services for the cloud.

On the other side of the architecture, the cloud service consumer comes into focus. As you well know, in a cloud environment there are many potential service consumers. The architecture above accounts for in-house IT as well as cloud service integration tools as consumers. There are countless others, but just with these you can begin to appreciate the challenge of effectively enabling the 'consumer.' This requires self-service portals, service catalogs, automation capability, federated security, federated connectivity, and more. It is certainly no small task.
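To make the self-service idea concrete, here is a minimal sketch of a service catalog that consumers provision from directly. This is purely illustrative; the class and offering names are my own invention, not part of IBM's reference architecture.

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """One offering a consumer can self-provision (names are hypothetical)."""
    name: str
    service_model: str       # e.g., "IaaS", "PaaS", "SaaS"
    monthly_price_usd: float

@dataclass
class ServiceCatalog:
    entries: dict = field(default_factory=dict)

    def publish(self, entry: CatalogEntry) -> None:
        self.entries[entry.name] = entry

    def request(self, name: str) -> CatalogEntry:
        # Self-service: the consumer provisions from the catalog directly,
        # with no operator in the loop.
        if name not in self.entries:
            raise KeyError(f"no such offering: {name}")
        return self.entries[name]

catalog = ServiceCatalog()
catalog.publish(CatalogEntry("small-vm", "IaaS", 25.0))
offering = catalog.request("small-vm")
print(offering.service_model)  # IaaS
```

A real catalog would of course sit behind a portal, enforce federated security, and trigger automated provisioning, but the core contract, publish and request, is this simple.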

Finally, in the middle of the diagram, we have perhaps the most complex role, the cloud service provider. This section builds on top of a shared, usually virtualized infrastructure to address two basic facets for providers: services and service management. From a services perspective, we see the trinity of the cloud (IaaS, PaaS, SaaS), with an added wrinkle, Business Process as a Service. As the diagram acknowledges, existing services and partner services will nearly always augment these services, thereby implying the need for tools that provide both functional and non-functional integration capabilities.
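The service taxonomy described above can be sketched in a few lines. This is an illustrative model only, assuming a simple `partner` flag to mark the existing and partner services that augment a provider's core offerings.

```python
from enum import Enum

class ServiceModel(Enum):
    """The 'trinity' plus the added wrinkle from the diagram."""
    IAAS = "Infrastructure as a Service"
    PAAS = "Platform as a Service"
    SAAS = "Software as a Service"
    BPAAS = "Business Process as a Service"

class Service:
    def __init__(self, name: str, model: ServiceModel, partner: bool = False):
        self.name = name
        self.model = model
        self.partner = partner  # existing/partner services augment core ones

# A composite offering mixing in-house and partner services; needing to
# stitch these together is what implies the integration tooling noted above.
offering = [
    Service("compute", ServiceModel.IAAS),
    Service("payroll", ServiceModel.BPAAS, partner=True),
]
partner_count = sum(1 for s in offering if s.partner)
print(partner_count)  # 1
```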

Opposite the services, we see the common management framework that divides into two major categories: Operational Support Services (OSS) and Business Support Services (BSS). Naturally, the OSS accounts for those capabilities that a provider needs to effectively operate a cloud environment. This includes provisioning, monitoring, license management, service lifecycle management, and a slew of other considerations. BSS outlines the capabilities providers need to support the business requirements of cloud, and this includes pricing, metering, billing, order management, order fulfillment, and more.
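As a rough illustration of one BSS chain, metering feeding pricing feeding billing, consider the sketch below. The metrics and rates are made up for the example; a real BSS would also handle orders, fulfillment, and far richer pricing models.

```python
# Hypothetical per-unit rates, USD; purely illustrative.
USAGE_RATES = {"vm_hours": 0.10, "gb_stored": 0.05}

def meter(records):
    """Aggregate raw usage records into totals per metric (metering)."""
    totals = {}
    for metric, amount in records:
        totals[metric] = totals.get(metric, 0.0) + amount
    return totals

def bill(totals, rates=USAGE_RATES):
    """Price the metered totals into a single invoice amount (billing)."""
    return sum(rates[metric] * qty for metric, qty in totals.items())

records = [("vm_hours", 100.0), ("gb_stored", 200.0), ("vm_hours", 50.0)]
totals = meter(records)           # {"vm_hours": 150.0, "gb_stored": 200.0}
invoice = bill(totals)            # 150 * 0.10 + 200 * 0.05
print(round(invoice, 2))  # 25.0
```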

Of course, there are non-functional requirements that span all three roles, including security, performance, resiliency, consumability, and governance. Thus, these wrap the three major roles in the reference architecture shown above.

I know there will be some who disagree with certain elements of this reference architecture, but that is good and healthy. For those who have strong opinions on this subject (one way or another), I encourage you to get involved. That is the benefit of this being in the Open Group. You can download the reference architecture, review it at your leisure, and then discuss and influence change via the mailing list discussion. In other words, speak up!

More Stories By Dustin Amrhein

Dustin Amrhein joined IBM as a member of the development team for WebSphere Application Server. While in that position, he worked on the development of Web services infrastructure and Web services programming models. In his current role, Dustin is a technical specialist for cloud, mobile, and data grid technology in IBM's WebSphere portfolio. He blogs at http://dustinamrhein.ulitzer.com. You can follow him on Twitter at http://twitter.com/damrhein.
