Cloud Brokers Make the Cloud Fit for Enterprise Requirements

Sometimes it’s fun to look back at your predictions to remember what you were thinking at the time

Sometimes it’s fun to look back at your predictions to remember what you were thinking at the time and see how accurate you turned out to be. Based on some recent conversations, I decided to revisit one of our early blog posts from 2009, where we were envisioning the direction this industry would take and the role our technology would play in it.

That post, Dynamic Cloud Fitting — the Future in Automated Cloud Management, described a world where workloads would move automatically to the right environment to meet a customer’s business and technical requirements. It also explored how an entity called a “cloud broker” (initially defined by Gartner in 2009) would provide the technology and expertise to achieve that goal.

Fast forward to 2011. The idea of workloads being redirected on the fly across different clouds for real-time cost/performance optimization is still a vision for the distant future – both because the technology is not yet available and because customers have shown little interest to date. However, the main concept of a cloud broker “fitting” a workload to a cloud environment based on technical and business requirements turns out to be very important to customers, and is already possible today. By providing an intermediation layer spanning multiple clouds, cloud broker software from companies like CloudSwitch can provide a range of capabilities beyond the scope of an individual cloud provider or service.

In terms of cloud “fitting,” what customers want is the ability to set parameters on a number of dimensions in order to control usage and optimize workload performance. These parameters include (but are certainly not limited to) the following, with a rough configuration sketch after the list:

  • Which cloud services they want to make available (and for which users)
  • Which geographic locations (regions, zones, data centers)
  • Cost limitations – per hour, based on quotas, etc.
  • Maximum latency that can be tolerated
  • Virtual machine requirements for CPU, memory, etc.
  • Maximum provisioning time that is acceptable
  • Minimum SLA required for reasonable availability
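
To make the idea concrete, here is a minimal Python sketch of how such parameters might be captured as a policy object. The field names and values are purely illustrative assumptions, not drawn from CloudSwitch or any other product:

```python
from dataclasses import dataclass

@dataclass
class FittingPolicy:
    """Hypothetical container for the 'fitting' parameters listed above."""
    allowed_clouds: list           # e.g. ["aws", "internal-vmware"]
    allowed_regions: list          # e.g. ["us-east-1", "eu-west-1"]
    max_cost_per_hour: float       # dollars per instance-hour
    max_latency_ms: int            # worst acceptable round-trip latency
    min_vcpus: int                 # virtual machine CPU requirement
    min_memory_gb: int             # virtual machine memory requirement
    max_provisioning_minutes: int  # longest acceptable time to provision
    min_sla_percent: float         # e.g. 99.9 for "three nines" availability

# Example policy a business-unit administrator might define (values invented).
engineering_policy = FittingPolicy(
    allowed_clouds=["aws", "terremark"],
    allowed_regions=["us-east-1"],
    max_cost_per_hour=0.50,
    max_latency_ms=80,
    min_vcpus=2,
    min_memory_gb=4,
    max_provisioning_minutes=15,
    min_sla_percent=99.9,
)
```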

What’s clear is that cloud broker software must incorporate an algorithmic approach to mapping the requirements of the enterprise, user groups, individual users, and specific workloads against the possible cloud services that have been enabled. This is a non-trivial process and is based on capturing and tracking a mix of inputs from administrators and users, as well as real-time data from the virtual machines, networks, and cloud providers. Think of this as being somewhat similar to recommendation engines on websites like kayak.com that help “fit” travelers’ requirements and preferences against available flights. Only in this case, the “flights” are instances that can be provided by one or more cloud providers or by internal virtualized resources.
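
A minimal sketch of that filter-and-rank step, reusing the hypothetical FittingPolicy from the example above; a production broker would of course weigh many more live inputs than this:

```python
def fit_workload(policy, candidates):
    """Return the candidate offerings that satisfy the policy, cheapest first.

    Each candidate is a dict describing one instance type in one cloud/region,
    e.g. {"cloud": "aws", "region": "us-east-1", "cost_per_hour": 0.34, ...}.
    """
    def satisfies(c):
        return (
            c["cloud"] in policy.allowed_clouds
            and c["region"] in policy.allowed_regions
            and c["cost_per_hour"] <= policy.max_cost_per_hour
            and c["latency_ms"] <= policy.max_latency_ms
            and c["vcpus"] >= policy.min_vcpus
            and c["memory_gb"] >= policy.min_memory_gb
            and c["provisioning_minutes"] <= policy.max_provisioning_minutes
            and c["sla_percent"] >= policy.min_sla_percent
        )

    feasible = [c for c in candidates if satisfies(c)]
    # Rank the feasible options; a real broker would fold in live telemetry,
    # quotas, and user preferences rather than simply sorting on price.
    return sorted(feasible, key=lambda c: c["cost_per_hour"])
```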

Another important aspect for the cloud broker software is to implement the “fitting” algorithm in the context of a role-based access control (RBAC) system. Think of this in terms of layers of enterprise controls and permissions that guide users’ options for self-service access to cloud resources. For example, a global administrator may set up the initial constraints based on which cloud services are available for the entire enterprise, while a business unit administrator may have more narrow limits for her users based on quotas, geographic constraints for certain teams, etc. – and the final end-user just wants to get his work done quickly and cost-effectively without worrying about any of this.
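
One way to picture the layering is that each administrative level can only narrow what the level above allows. The sketch below is a hypothetical illustration of that intersection logic, not a description of any particular RBAC implementation:

```python
def narrow(outer, inner):
    """Combine two constraint layers; the result is at least as strict as both."""
    return {
        "allowed_clouds": sorted(set(outer["allowed_clouds"]) & set(inner["allowed_clouds"])),
        "allowed_regions": sorted(set(outer["allowed_regions"]) & set(inner["allowed_regions"])),
        "max_cost_per_hour": min(outer["max_cost_per_hour"], inner["max_cost_per_hour"]),
    }

# Constraints set by the global administrator for the entire enterprise.
enterprise = {"allowed_clouds": ["aws", "terremark"],
              "allowed_regions": ["us-east-1", "eu-west-1"],
              "max_cost_per_hour": 2.00}

# Tighter constraints a business-unit administrator applies for her teams.
business_unit = {"allowed_clouds": ["aws"],
                 "allowed_regions": ["us-east-1"],
                 "max_cost_per_hour": 0.50}

# The end user self-serves only within the effective (intersected) constraints,
# without having to know about either administrative layer.
effective = narrow(enterprise, business_unit)
```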

One point we didn’t foresee in our original blog post on cloud fitting was how the cloud broker’s role would expand. In addition to on-boarding applications into the cloud, customers now look to cloud brokers to fill important gaps: capabilities that cloud providers either don’t want to deliver (such as multi-cloud support) or find hard to deliver because of architectural limitations.

Security is a good example of the latter: the shared environment of the cloud makes it hard to give individual customers control over encryption and key management, something that enterprises frequently require to get CSO sign-off. Extending enterprise network configurations into the cloud with full control is also challenging for most cloud providers, since their network architectures are by definition fairly “flattened” and offer limited options such as multiple subnets (unless the customer is willing to pay for a dedicated network setup). This is another area where cloud brokers can help bridge the gap between what enterprise users need and what multiple cloud providers can deliver.
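
As a purely illustrative aside (assuming the third-party Python cryptography package, and not describing any vendor’s actual mechanism), customer-controlled encryption comes down to keeping the key on the enterprise side and handing the provider only ciphertext:

```python
# Requires the third-party `cryptography` package (pip install cryptography).
from cryptography.fernet import Fernet

customer_key = Fernet.generate_key()      # generated and kept on-premises
cipher = Fernet(customer_key)

plaintext = b"database snapshot destined for cloud storage"
ciphertext = cipher.encrypt(plaintext)    # only this blob ever leaves the enterprise

# Later, the enterprise (not the provider) decrypts with its own key.
assert cipher.decrypt(ciphertext) == plaintext
```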

So the role of a cloud broker turns out to be evolving and growing broader over time, and no doubt will continue to do so. There’s a broad consensus that the broker’s role is important — not just here at CloudSwitch, but also among industry analysts like Gartner, other technology vendors, and our enterprise customers. The key insight is that cloud brokers allow enterprises to extend their control over their applications and data into the cloud. This ability to put control in the hands of the customer is what matters the most.

Read the original blog entry...

More Stories By Ellen Rubin

Ellen Rubin is the CEO and co-founder of ClearSky Data, an enterprise storage company that recently raised $27 million in a Series B investment round. She is an experienced entrepreneur with a record in leading strategy, market positioning and go-to-market efforts for fast-growing companies. Most recently, she was co-founder of CloudSwitch, a cloud enablement software company, acquired by Verizon in 2011. Prior to founding CloudSwitch, Ellen was the vice president of marketing at Netezza, where as a member of the early management team, she helped grow the company to more than $130 million in revenues and a successful IPO in 2007. Ellen holds an MBA from Harvard Business School and an undergraduate degree magna cum laude from Harvard University.
