Software-Defined Technologies: Transforming the Value Stream | @CloudExpo #SDN #NFV #Cloud

Software-defined technologies explicitly change the way functions and activities are organized and managed

Software-defined is a concept that refers to the ability to control some or all of the functions of a system using software. The concept is sometimes incorrectly characterized as a buzzword or marketing jargon, when in fact it has a clear meaning that needs to be understood by organizations looking to keep pace with change.

When technologies become software-defined, there are major systemic benefits for organizations that use them, including lower costs, higher quality products and services, and less risk.

At the same time, software-defined technologies require major organizational changes for incumbent enterprises to adopt and use effectively. This often involves expensive and risky transformation projects that reengineer the value stream to take advantage of decoupled components, reduced dependencies and new management capabilities.

Today we will look at the origins of the "software-defined" concept and how its application presents both opportunities and challenges to the enterprise.

The Beginning: ‘Software-Defined Radio'
The software-defined concept comes to us from the evolution of radio transmission technology. A traditional radio communications system uses physically connected components that can only be modified through physical intervention. The antenna connects to the amplifier, which connects to the modulator, and so on. Operators are locked into the specifications of the components, the order in which they are connected, and whatever controls they expose. It's an extremely inflexible technology and changes are best done by simply buying a new system.

As you can imagine, for businesses that operate large-scale radio deployments such as wireless telecom providers, technology decisions are hugely impactful. They can last decades and demand large upfront planning and capital costs. Keeping pace with change is extremely expensive and difficult.

[Image: Base Transceiver Station]
In the mid-1980s, however, researchers began taking specific components of the radio and making them digital, implementing functions like oscillators, mixers, amplifiers and filters in software running on a computer. By emulating these functions in software, the system becomes adaptive and programmable, and can be configured according to the needs and requirements of the operator rather than the specifications of the manufacturer.
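To make this concrete, here is a minimal sketch of a "receiver stage" implemented entirely in software: the local oscillator and low-pass filter are ordinary code, so retuning means changing a variable rather than swapping hardware. The sample rate, tone frequencies and filter design are illustrative assumptions, not a real SDR pipeline.

```python
import math

fs = 8_000           # sample rate (Hz) -- illustrative choice
f_carrier = 1_000    # carrier frequency we want to receive (Hz)
n = fs               # one second of samples
t = [i / fs for i in range(n)]

# Incoming signal: a 50 Hz tone amplitude-modulated onto the carrier.
baseband = [math.cos(2 * math.pi * 50 * x) for x in t]
received = [b * math.cos(2 * math.pi * f_carrier * x) for b, x in zip(baseband, t)]

# Software mixer: multiply by a numerically generated local oscillator.
mixed = [r * math.cos(2 * math.pi * f_carrier * x) for r, x in zip(received, t)]

# Software low-pass filter: a centered moving average removes the
# high-frequency (2 * f_carrier) mixing product.
half = 40
recovered = []
for i in range(n):
    lo, hi = max(0, i - half), min(n, i + half)
    window = mixed[lo:hi]
    recovered.append(2 * sum(window) / len(window))

# "recovered" now approximates the original 50 Hz tone; retuning to a
# different carrier is just a matter of changing f_carrier.
```

Because every stage is code, the "hardware" specification is now a set of parameters the operator controls.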

In 1995, the term Software-Defined Radio (SDR) was coined to describe the commercialization of the first digital radio communication system, and this development changed the way these services and products can be delivered.

On the technical side, in becoming software-defined, many functional limitations are removed from radio systems. For example, by simply reprogramming the software, a device can have its frequency spectrum changed, allowing it to communicate with different devices and perform different functions. This has enabled a quick succession of technical advances that were previously the domain of theory and imagination, like ultra-wide band transmission, adaptive signaling, cognitive radio and the end of the "near-far" problem.

On the business side, the changes are equally profound, with a significant impact on the value streams of enterprises throughout the wireless and radio industry, and on the industry itself. A wireless telecom provider employing software-defined radio can easily add new features to its network, adapt its systems to take advantage of new spectrum bands, or reconfigure itself when a new handset technology like LTE 4G becomes available. A telecom provider able to reconfigure its infrastructure by deploying software updates rather than buying new hardware can realize huge operational savings while reducing capital expenses.

SDR therefore provides significant strategic advantage to these businesses, introducing adaptability, modularity and agility to the organization where it was previously rigid and inflexible.

Taking advantage of SDR, however, is a long, transformational process, needing a lot of capital and a significant departure from the status quo. Not only does it require changing all infrastructure over to the new technology, but it also requires the business to think differently and reengineer the value chain to take advantage of the new capabilities.

Software-Defined Infrastructure
The IT industry has also been deeply impacted by the advent of software-defined technologies. The following examples have created industries and enabled a generation of evolved products and services:

  • Hypervisors - A hypervisor is an operating system that runs virtual machines, like VMware ESXi or Microsoft Hyper-V. It runs directly on the physical machine, abstracting and distributing the hardware resources to any number of virtual machines. This has undoubtedly been one of the largest and most impactful advances in IT in the last 20 years, ushering in the era of point-and-click server deployment and changing the way we manage and deliver IT services.
  • Software-Defined Networking (SDN) - Traditionally, operating a network means managing lower-level infrastructure that allows devices to connect, communicate with each other, and figure out where to send their packets. These switching devices - called "layer 2 devices" - each maintain their own state and configuration information, and make forwarding decisions based only on limited, locally available information. SDN abstracts layer 2 networking and is the ‘secret sauce' behind cloud computing - a critical capability for all public cloud services, including AWS, Azure and OpenStack-based providers. It allows the service provider to centralize routing and switching, and provides the orchestration required for large-scale multi-tenancy, i.e., the ability to create and manage millions of logically isolated, secure networks.
  • Network-function virtualization (NFV) - Building on SDN, NFV allows services like load balancers, firewalls, IDS, accelerators and CDNs to be deployed and configured quickly and easily. Without NFV, operating infrastructure at scale requires significant capital investment and a team of highly specialized network engineers. NFV makes it easy to deploy, secure and manage these functions without having to understand the complexity under the hood.
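The SDN idea described above - dumb, stateless switches with a central controller owning all policy - can be sketched in a few lines. The class and method names here are hypothetical and purely illustrative; real SDN deployments use controllers speaking protocols such as OpenFlow.

```python
class Switch:
    """Forwards packets purely from rules installed by the controller."""
    def __init__(self, name):
        self.name = name
        self.flow_table = {}          # dst_subnet -> out_port

    def install_rule(self, dst_subnet, out_port):
        self.flow_table[dst_subnet] = out_port

    def forward(self, dst_subnet):
        # In real SDN, unknown destinations are punted to the controller.
        return self.flow_table.get(dst_subnet, "controller")


class Controller:
    """Central brain: holds the global view and pushes rules to switches."""
    def __init__(self):
        self.switches = {}

    def register(self, switch):
        self.switches[switch.name] = switch

    def set_route(self, switch_name, dst_subnet, out_port):
        self.switches[switch_name].install_rule(dst_subnet, out_port)


ctrl = Controller()
sw1 = Switch("sw1")
ctrl.register(sw1)
ctrl.set_route("sw1", "10.0.1.0/24", 2)

print(sw1.forward("10.0.1.0/24"))   # prints 2 -- decided centrally
print(sw1.forward("10.0.9.0/24"))   # prints "controller" -- unknown traffic is punted
```

The design point is that the switch itself makes no policy decisions: replace it with another instance, replay the controller's rules, and the network behaves identically.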

"Software-Defined" Defined

Having looked at where the concept came from and a few examples of modern software-defined technologies, I propose the following definition for what it means to be "software-defined":

Software-defined means some or all of the functions of a system can be managed and controlled through software.

Some key attributes of a software-defined technology:

1. The functions are abstracted
Software-definition strives for stateless functions, i.e., functions that do not maintain their own configuration or state. State and configuration information is maintained outside the function, in the software layer. By decoupling state and configuration from the function and centralizing them, we gain adaptability, resilience, and the benefit of visibility at scale.

2. Software controls functionality
No direct operator or human intervention is required for the function to operate - functions are managed solely through software. Management and administration are therefore decoupled from the function. We gain the ability to automate processes and activities, and manage the system independently from functional limitations.

3. Functional components are modular
The software layer operates independently of the functional components, which means those components can be commoditized, modular and scalable. We can easily change or replace them without disrupting the system.
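The three attributes above can be illustrated together in a short sketch: a stateless worker reads its configuration from a central store, so management happens in software and worker instances are interchangeable. All names here are hypothetical, for illustration only.

```python
# Central state: configuration lives outside the functional components.
CONFIG_STORE = {"rate_limit": 100, "region": "us-west"}


class Worker:
    """Stateless functional component: reads config, never stores it."""
    def handle(self, n_requests):
        limit = CONFIG_STORE["rate_limit"]   # state lives outside the function
        return min(n_requests, limit)


# Attribute 3: any worker instance behaves identically, so components are
# modular and replaceable without migrating local state.
old, new = Worker(), Worker()
assert old.handle(250) == new.handle(250) == 100

# Attribute 2: management happens in software, not on the component.
CONFIG_STORE["rate_limit"] = 500
assert new.handle(250) == 250
```

Swapping `old` for `new` (or for a hundred more instances) requires no migration step, which is exactly the modularity the third attribute describes.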

Adoption through Transformation
On the face of it, software-defined technologies are better-faster-stronger, and companies that use them will have a competitive advantage over those that do not. They lead to lower costs, higher quality and less risk for the business. Smaller organizations building products and services that leverage these technologies can use them to disrupt incumbent enterprises.

For those enterprises, however, especially those locked in the middle of a legacy lifecycle, software-defined technologies present a significant challenge. Adoption requires rethinking the value stream and integrating with legacy systems. As stated in the book Lean Thinking,

"We are all born into a mental world of ‘functions' and ‘departments,' a commonsense conviction that activities ought to be grouped by type so they can be performed more efficiently and managed more easily" (p. 25)

Not only do software-defined technologies present a threat to the enterprise in the hands of startups, they also explicitly change the way functions and activities are organized and managed. Adoption demands rethinking how, where, when and by whom functions should operate in the value stream. This entails changing culture, reorganizing roles and team structures, and reengineering the value stream. This kind of change must be driven through risky and expensive transformation projects.

Traditional enterprises therefore face major challenges in developing competencies in new software-defined systems. Leaders in these organizations will need to be highly flexible and open to a paradigm shift in how they think about their work.

More Stories By John Rauser

John Rauser is the IT Manager at Tasktop Technologies, a global enterprise software company. He also serves as VP Operations at the board of the Project Management Institute - Canadian West Coast Chapter, providing leadership and expertise on technology issues. He has a passion for discussing the business impacts of technology and analyzing strategies for managing IT.
