Open Source Cloud Bits

The Role of Open Source in an Infrastructure as a Service (IaaS) Stack

Last week I got into a nice discussion on Twitter regarding the role of open source in an infrastructure as a service (IaaS) stack.  With open source cloud stacks from Eucalyptus, Cloud.com, Abiquo and others competing against proprietary solutions from Enomaly, VMware and others, the picture can get confusing quickly.

For clarity, here is my position on open source vs. proprietary source in this part of the market:  both have a role to play, and neither is inherently better or more advantaged than the other.  However, when you get into the details, there are factors that might favor one model over the other in specific cases. I will look at this from the perspective of the service providers and enterprises who use cloud stacks.  In a future post I may touch on the factors that vendors should consider when choosing between open source and closed source models.

For service providers, margins are critical.  Any increase in capital and operating costs must enable a corresponding increase in value provided in the market.  Amazon and Google have the scale and ability to build a lot of capabilities from scratch, trading a short-term increase in R&D against a long-term decrease in operating costs.

While some cloud providers may attempt to match the low-cost giants on pricing, they know that they need to differentiate in some other material way (e.g. performance, customer service, etc.).   For these providers, the more “free open source” technology that they can leverage, the lower their operating costs may be.

This low-cost focus must permeate their decision making, from the physical infrastructure (commodity servers, JBOD/DAS storage, etc.) to the hypervisor (Xen or KVM vs. VMware), to the cloud provisioning/automation layer, and more.  Open source CMDBs, monitoring technologies (e.g. Nagios) and other tools are often found in these environments.
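To make the unit economics concrete, here is a minimal back-of-the-envelope sketch in Python. Every number in it is a hypothetical assumption for illustration – not a quote from any vendor – but it shows why per-host license fees matter so much to a provider packing many VMs onto commodity hardware.

```python
# Back-of-the-envelope annual software cost per VM for an IaaS provider.
# All prices are hypothetical assumptions, not real vendor quotes.

def annual_cost_per_vm(license_per_host, support_per_host, vms_per_host):
    """Annualized software cost (license + support) spread across
    the VMs packed onto one physical host."""
    return (license_per_host + support_per_host) / vms_per_host

# Open source stack: no license fee, but assume paid support/engineering.
open_source = annual_cost_per_vm(license_per_host=0,
                                 support_per_host=1_500,
                                 vms_per_host=30)

# Proprietary stack: assume a per-host license plus a support contract.
proprietary = annual_cost_per_vm(license_per_host=4_000,
                                 support_per_host=1_000,
                                 vms_per_host=30)

print(f"open source: ${open_source:,.2f} per VM per year")
print(f"proprietary: ${proprietary:,.2f} per VM per year")
```

On the thin margins most providers run, that per-VM delta flows straight to the bottom line – which is exactly why the low-cost players squeeze license fees out of every layer they can.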

There are trade-offs, of course.  Open source can often be more difficult to use, lack key functionality, or suffer from poor support – all of which increase costs in often material and unintended ways (note that proprietary solutions can have many of the same issues, and do more often than most people realize).

Other service providers may target the enterprise and focus on highly differentiated offerings (though I really haven’t seen much differentiation yet, at least at the IaaS level).  For these providers, the benefits of enterprise-grade storage (EMC, NetApp, HP), VMware’s HA and fault-tolerance capabilities, and other capabilities gained from using tools from HP, IBM, BMC and other vendors may be well worth the increase in cost.  And make no mistake, the cost increase from using these technologies can be quite substantial.

Newer vendors, such as Enomaly, are having some success despite their closed-source nature (Enomaly started as open source but changed models in 2009).  Further, even when a provider uses a solution from Cloud.com or Abiquo, both of which have open source models, it will often choose to pay for a premium edition in order to get functionality or support not available via open source.  In reality, anybody serious about this market will want a mix of open-source (though not necessarily free) and closed-source technologies in their environment.

In the enterprise, the story is a bit different.  If you’re already paying VMware for an all-you-can-eat enterprise license agreement (ELA), the marginal cost to use vSphere in your private cloud is zero.  KVM and Xen are not less expensive in this case.  The same is true for tools from HP, IBM, BMC and others.
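A quick sketch of that marginal-cost logic, with hypothetical numbers: once the ELA is prepaid, the license term drops out of the cost of the next VM entirely, so "free" open source buys you nothing at the margin.

```python
# Marginal software cost of one more private-cloud VM.
# Hypothetical numbers; illustrates why a prepaid ELA changes the math.

def marginal_vm_cost(per_vm_license, covered_by_ela):
    # Under an all-you-can-eat ELA the license is a sunk cost:
    # the next VM adds no incremental license spend.
    return 0 if covered_by_ela else per_vm_license

print(marginal_vm_cost(per_vm_license=200, covered_by_ela=True))   # vSphere under ELA -> 0
print(marginal_vm_cost(per_vm_license=0,   covered_by_ela=False))  # KVM/Xen -> 0
```

Both come out to zero, so the decision has to turn on capability and fit rather than price.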

The primary question, then, is whether or not these are the right solutions.  Does BMC have a better answer for private clouds than Eucalyptus?  Is IBM CloudBurst better than Abiquo for development and test?

Open source for open source’s sake is not rational.

In addition, focusing only on the economics of open source misses what might be the bigger value – risk reduction.  Closed-source products can die – either because the developer goes out of business or because an acquirer decides to take the product off the market.  This happens all the time.  For large and well-established technologies, the risk of abandonment is generally lower.  VMware, HP and EMC are not going anywhere soon.

Open source projects, in contrast, can always be continued.  The cost may fall to those dependent on the project, but at least you get the option.  Not so with closed source – especially if the solution is killed by its owner.

Most buyers can get source code escrow terms that give them access to the source for a product in the event of bankruptcy or similar situations.  In 20 years I have not seen a source escrow addendum include a trigger to release the code if the developer stops or slows investment in it.  Today your vendor might have 20 top-tier developers delivering on a roadmap.  What if in 3 years they have only 4 folks maintaining the current code line and making minor updates?  Can you get the source code then?  Typically not.

There’s another issue that often gets overlooked.  Even if you have a source escrow agreement, that doesn’t mean that the code deposits are being made on a regular basis.  It also doesn’t mean that the code is well-commented or that accurate build scripts are included such that a person of “commercially reasonable” skill can take over the code and move it forward.  I have seen this situation happen more than once, including recently, and it’s quite a shock to learn that your vaunted supplier has been careless, lazy, or even deliberately misleading about their source code responsibilities.

CloudBzz Recommendations

1.  Insist on open source (or at least full source access – not escrow) when one or more of the following situations exist:

- the supplier is small or thinly funded (VCs can and do pull the plug even after many million$ have been invested)
- the capability/functionality provided by the technology is strategically important to you, especially when investment must be maintained to remain leading-edge in a fast-moving and intensely competitive market
- migration to a different technology would be very costly and disruptive

2.  Consider closed-source/proprietary solutions when two or more of the following factors are present (a simple sketch encoding both rules follows this list):

- the functionality provided by the software is not core to your competitive positioning in the market
- replacement costs (particularly internal change costs) are moderate or low
- the functionality and value are so much higher than open source alternatives that you’re willing to take the risk
- the technology is so widely deployed and successful that the risk of abandonment is very low
- the costs are low enough so as not to make your offering uncompetitive or your internal environment unaffordable
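As a rough illustration only, here is a small Python sketch encoding the two rules above. The factor names are hypothetical shorthand for the bullets; a real decision would weigh them with far more nuance.

```python
# Hypothetical encoding of the CloudBzz recommendations above.
# Rule 1: insist on open source if ANY of its risk factors is present.
# Rule 2: consider closed source only if TWO OR MORE factors favor it.

OPEN_SOURCE_TRIGGERS = {"supplier_thinly_funded",
                        "strategically_important",
                        "high_migration_cost"}

CLOSED_SOURCE_FACTORS = {"not_core_to_positioning",
                         "low_replacement_cost",
                         "much_higher_value",
                         "widely_deployed_low_risk",
                         "costs_still_competitive"}

def recommend(factors):
    """factors: a set of strings drawn from the two collections above."""
    if factors & OPEN_SOURCE_TRIGGERS:
        return "insist on open source (or full source access)"
    if len(factors & CLOSED_SOURCE_FACTORS) >= 2:
        return "closed source is a reasonable option"
    return "no clear signal; weigh case by case"

print(recommend({"supplier_thinly_funded"}))
print(recommend({"not_core_to_positioning", "widely_deployed_low_risk"}))
```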

Balancing risk, capability and control is very difficult – even more so in a young and emerging market like cloud computing.  The decisions made in haste today can have a profound impact on your success in the future – especially if you are a cloud service provider.

While open source can be a very potent source of competitive advantage, it should not be adopted purely on philosophical grounds.  If you do adopt closed source, especially at the core stack level, aggressively manage your exposure and make sure that those “unforeseen events” don’t leave you high and dry.


More Stories By John Treadway

John Treadway is a Vice President at Cloud Technology Partners and has over 20 years of experience delivering technology and business solutions to domestic and global enterprises across multiple industries and sectors. As a senior enterprise technology and services executive, he has a successful track record of leading strategic cloud computing and data center initiatives. John is responsible for technology IP at Cloud Technology Partners and is actively involved with client projects and strategic alliances. He is also an active blogger in the cloud computing space and authors the CloudBzz blog.
