Exploding Open Source — CIO Lessons from the Edge By @ABridgwater | @CloudExpo #Cloud #BigData

We now have a variety of established frameworks & bodies dedicated to higher-level formalization & codification of open source

The combined notions of open source and the ‘community contribution’ model of collaborative software application development are, of course, not new.

The history of open source actually traces back to early software exchanges between universities in the 1960s, driven by academic principles of knowledge sharing. Decades later, on August 25, 1991, Finnish computer scientist Linus Torvalds announced Linux... and the rest is history.

But this is not a history lesson; this is an examination of current realities.

Formalization and codification
We now have a variety of established frameworks and bodies dedicated to the higher-level formalization and codification of open source. Names such as The Apache Software Foundation's Hadoop, OpenStack and The Linux Foundation sit alongside Docker, Drupal and OpenDaylight in a world where these brands are as respected as any previously proprietary-only technology.

Part of this acceptance has come with time and familiarity, but the key reason for these projects' positive approval ratings is plainly their functionality, flexibility and power. Combine those factors with the availability of ‘commercially licensed’ versions of the open code in question and you have a virtuous circle of positive development.

The growth of ‘commercially licensed’ versions of open code projects (often called Enterprise Edition by vendors) is important because it provides locked-down iterations of the software in use.

Dealing with dynamism
Open source is characterized by its dynamism: anyone can take a piece of software, modify it and submit the changes back to the project ‘stewards’ or owners; if the changes are good, they may make it into the next release of the software.

That's all well and good...

... but if you are running an air traffic control system, creating software to run healthcare systems or medical equipment, or perhaps writing software functions to drive financial markets - the need to run static (as opposed to dynamic) code is of course paramount.

Just look at the numbers and you will appreciate the size of this market:

  • The total lines of source code present today in the Linux Foundation's Collaborative Projects number 115,013,302.
  • The estimated total effort required to retrace the steps of collaborative development for these projects is 41,192.25 person-years.
  • In other words, it would take 1,356 developers 30 years to re-create the code bases of the Linux Foundation's current Collaborative Projects. The total economic value of this work is estimated to be over $5 billion.
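The person-years figure above lines up almost exactly with the basic ‘organic’ COCOMO model (effort in person-months ≈ 2.4 × KLOC^1.05), which suggests how such estimates are produced. A minimal sketch - assuming that model, plus a hypothetical fully loaded cost of $125,000 per person-year that is an illustration, not a Linux Foundation figure - reproduces the numbers:

```python
# Basic "organic" COCOMO estimate (Boehm, 1981): effort in person-months
# is 2.4 * KLOC ** 1.05. The cost-per-person-year constant below is an
# assumption for illustration only.
SLOC = 115_013_302                 # lines of code cited for the Collaborative Projects
KLOC = SLOC / 1000                 # COCOMO works in thousands of lines

effort_person_months = 2.4 * KLOC ** 1.05
effort_person_years = effort_person_months / 12

COST_PER_PERSON_YEAR = 125_000     # assumed fully loaded developer cost (USD)
economic_value = effort_person_years * COST_PER_PERSON_YEAR

print(f"Effort: {effort_person_years:,.2f} person-years")
print(f"Estimated value: ${economic_value / 1e9:.1f} billion")
```

Running this yields roughly 41,192 person-years, matching the bullet above, and the dollar figure lands just over $5 billion; note how sensitive the valuation is to the assumed cost per person-year.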

The open source opportunity
These numbers are meant to convey one thought: the need for software intelligence is huge, but much of it is already available if CIOs look to established, approved, well-supported and secure open source projects.

It has been called the ‘distributed genius of thousands’: all this Intellectual Property (IP) is out there, available for any firm to tap into if it knows how to harness it. The software is free, yes, until you reach the point where you need to pay for enterprise versions of the code plus maintenance and support - at that point, it's the same as any other kind of software.

What differs with open source is that firms can start to augment and modify the code they have brought in and improve specific aspects of it based upon their own functionality requirements. This comes full circle when the firm then starts to contribute those functions back to the project and other companies start to benefit from the total increase in the knowledge base.

Taking without giving
CIOs will also quickly learn that taking without giving is a bad idea. Firms that do this fail to benefit from the wider positive evolution that exists in the full-blown open source world. The community provides peer review to help tune code, so why miss out on it?

As a specific reference for this story, look at the KPMG Analytics and Visualization Environment (KAVE), which has been open source from the start. It is a modular Big Data platform that can be tailored to each customer's needs and grows along with the business.

KPMG has clearly specified its reasons for choosing open source components and says it selected software on the following basis:

  • Where there were no ‘sufficiently advanced’ closed-source competitors
  • Where there were licenses for use by commercial and non-commercial organizations
  • Where the software exhibited ‘class-leading’ performance or offered a class-defining solution with a history of excellence
  • Where there existed good support in terms of an active user community and/or an open source contribution community
  • Where there was full horizontal scalability for immediate use in full-blown Big Data environments

The higher-level commercial use of open source software is now an operational reality for firms of all sizes - now is the time to know more.

This post is sponsored by KPMG LLP and The CIO Agenda.

KPMG LLP is a Delaware limited liability partnership and is the U.S. member firm of the KPMG network of independent member firms affiliated with KPMG International Cooperative ("KPMG International"), a Swiss entity. The KPMG name, logo and "cutting through complexity" are registered trademarks or trademarks of KPMG International. The views and opinions expressed herein are those of the authors and do not necessarily represent the views and opinions of KPMG LLP.

More Stories By Adrian Bridgwater

Adrian Bridgwater is a freelance journalist and corporate content creation specialist focusing on cross-platform software application development, as well as all related aspects of software engineering, project management and technology as a whole.

