By John Cowan
January 15, 2012 08:12 PM EST
Abstract: Cloud Brokerage is an emerging trend in the broader cloud computing industry. Opinions differ widely about what it means to be a broker and about the significance brokers will have on the future of the industry as a whole. The reality is that the brokerage model signals the real potential to commoditize the compute utility, culminating in the genesis of compute as a tradable commodity like soybeans, oil or minerals. In this four-part series we will take a deep dive into the concept of cloud brokerage and connect the dots between the key trends and market demands that will shape a force few in the industry see coming and fewer still are prepared to accept.
Part I: The Analysts Weigh In
The National Institute of Standards and Technology (NIST) has published a working paper on the subject of cloud brokerage, signaling the importance of a movement taking shape across the cloud computing industry as a whole.
The NIST working document describes the Cloud Brokerage success criteria as follows:
A cloud-user wishes to carry out an action on cloud-provider-1 using a federated interface, with no direct knowledge of cloud-provider-1 commands or interfaces. A cloud-management-broker offers the cloud-user a federated interface to multiple cloud-providers through a human user interface, an application programming interface or both. The cloud-user selects desired cloud-provider-1 resources, action and action parameters using the cloud-management-broker interface. The cloud-management-broker collects and marshals the selected action and parameters from the cloud-user’s selection and issues the desired command to cloud-provider-1 using cloud-provider-1 native interface.
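The flow NIST describes — a federated interface marshaling a user's request into each provider's native commands — is essentially an adapter pattern. A minimal sketch of that idea follows; the provider names, command formats and method names here are hypothetical illustrations, not any real provider's API:

```python
from abc import ABC, abstractmethod

class CloudProvider(ABC):
    """Adapter wrapping one provider's native interface."""
    @abstractmethod
    def native_command(self, action: str, params: dict) -> str: ...

class ProviderOne(CloudProvider):
    # Hypothetical provider: expects a verb-style command string.
    def native_command(self, action, params):
        args = ",".join(f"{k}={v}" for k, v in sorted(params.items()))
        return f"p1:{action}({args})"

class ProviderTwo(CloudProvider):
    # Hypothetical provider: expects an uppercase verb and a parameter list.
    def native_command(self, action, params):
        return f"P2 {action.upper()} {sorted(params.items())}"

class CloudManagementBroker:
    """Federated interface: the user never sees provider-native commands."""
    def __init__(self):
        self.providers = {}

    def register(self, name: str, provider: CloudProvider) -> None:
        self.providers[name] = provider

    def execute(self, provider_name: str, action: str, **params) -> str:
        # Marshal the user's selection into the provider's native interface.
        return self.providers[provider_name].native_command(action, params)

broker = CloudManagementBroker()
broker.register("cloud-provider-1", ProviderOne())
broker.register("cloud-provider-2", ProviderTwo())
print(broker.execute("cloud-provider-1", "launch", size="m1.small"))
# → p1:launch(size=m1.small)
```

The point of the pattern is that adding cloud-provider-3 means writing one new adapter; the cloud-user's federated interface never changes.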
The idea of cloud brokerage generated enough noise to be covered in detail by the analyst community in 2011. However, depending on whom you ask, cloud brokerage has very different meanings. While there has been definite progress on the part of the analyst community, I think the potential of what this model could mean for the cloud computing market goes much deeper.
To be clear, I believe the role of what I am calling the “infrastructure broker” will be the most significant movement in the computing industry since the advent of virtualization and cloud.
Before I get into the immense complexities of that statement, let’s take a look at a few perspectives from key industry analysts:
According to Gartner research expert Benoit Lheureux, the role of the Cloud Service Broker (CSB) is to “aggregate and add value to cloud services by providing a single point of entry to different types of cloud services.” Gartner goes on to illustrate some key defining characteristics of a cloud broker. According to Gartner, a vendor qualifies as a CSB only if it genuinely performs:
- Aggregation across VARs and IT distributors
- Integration with systems integrators
- Customization for SIs and professional services organizations
Gartner’s definition of Cloud Brokerage is by far the lightest among the analysts. If you believe Lheureux, Cloud Brokerage is really just the modernization of the IT channel.
451 generally considers the CSB category part of a broader market it calls cloud on-ramps. In addition to providing some sort of provisioning technology, CSBs differ “in that they provide a value-added economic function, which matches workloads to the best execution venues.”
While I think 451 pays only cursory attention to cloud brokerage as a concept, they are at least more directionally correct in the sense that they see brokerage providing a level of sophistication that is unique in the delivery of cloud services – namely, the concept of ‘workload matching’.
I think Forrester has done the best job among the research outfits of taking a seriously deep look at the CSB market definition. Forrester sees the CSB playing a pivotal role in the future of the entire industry. Analyst Stefan Reid’s taxonomy diagram does a fantastic job of identifying the interactions among the different players.
According to Forrester, “the simple broker model gains value only by comparing similar cloud provider options and using dynamic provisioning based on the actual spot prices of these resources.” This sounds similar to 451 and Gartner in direction and tone.
But Forrester goes on to elaborate on what they see as the evolution of the brokerage model. “The full broker [model] goes far beyond [the simple broker]. It uses ‘cloud bursting’ to provide IT users with higher value for a lower price.” Cloud bursting, Forrester explains, “is the dynamic relocation of workloads from private environments to cloud providers and vice versa.” I’ll admit a slight sigh when I hear the cloud ‘bursting’ term (again), but I think Forrester has a great grasp of the technical role of the broker.
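Forrester's "simple broker" — dynamic provisioning against actual spot prices — reduces to a selection function over provider quotes, with bursting as the relocation decision when the private environment runs out of headroom. A toy sketch of that decision; the provider names, prices and capacity figures are invented for illustration:

```python
def choose_venue(workload_cores: int, private_free_cores: int,
                 spot_prices: dict) -> tuple:
    """Simple-broker decision: run privately when capacity allows,
    otherwise 'burst' the workload to the cheapest public spot venue.

    Returns (venue_name, estimated_hourly_cost)."""
    if workload_cores <= private_free_cores:
        return ("private", 0.0)
    cheapest = min(spot_prices, key=spot_prices.get)
    return (cheapest, spot_prices[cheapest] * workload_cores)

# Hypothetical spot quotes in $/core-hour
quotes = {"provider-a": 0.085, "provider-b": 0.072, "provider-c": 0.091}

print(choose_venue(4, 16, quotes))   # fits privately → ('private', 0.0)
print(choose_venue(32, 16, quotes))  # bursts to provider-b, the cheapest venue
```

A real full broker would of course weigh far more than price — data gravity, compliance, SLAs, migration cost — but the economic core is exactly this kind of workload-to-venue matching.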
Gartner sees brokering as little more than modern distribution. 451 sees the concept as something it instinctively must cover, but the details are hazy. Forrester has obviously put the most thought into its analysis. But the consistent underlying theme across these analysts is that brokerage implies a model whereby vendors insert themselves and their technology between supplier and consumer to provide a layer of transactional value.
The debate and discussion go much deeper than this, and the potential of the cloud broker is far more profound.
In Part II of this post we will take a closer look at the role of the intermediary and who is likely to take up this position in the market.