
SDN: Concrete or Concept?

Is SDN a concept, or a concrete architectural construct? Does it really matter?

Now, if we look at the benefits, we can attempt to infer the problems SDN is trying to solve:

Benefit: Programmability
Problem: Network components today, particularly hardware-based components, run specific feature sets that can only be modified by the vendor, which happens on a 12-18 month schedule, security fixes and hot-fixes notwithstanding. New features and functions can be added only by the vendor, based on the vendor's prioritization, not the customer's.

Benefit: Automation
Problem: Manual configuration of network components is time-consuming, costly, and introduces a higher risk of human error that can result in outages, poor performance, or security risks.

Benefit: Network control
Problem: The network today doesn't adapt rapidly to changing conditions or events. While some protocols simulate such adaptability, they can't autonomously route around outages or failures, or easily modify existing policies.

These are certainly problems for IT organizations of varying sizes and compositions. The question then is, how does SDN uniquely solve those problems?

The answer is that as a concrete solution (i.e., components, software, and architectures) it does not uniquely solve those problems. As a concept, however, it does.

Someone is no doubt quite upset by that statement. Let's explain it before anyone's head explodes.

CONCEPT versus CONCRETE

The concept of separating the data and control planes enables programmability. Without that separation, we have what we have today – static, inflexible networking components. But the concept of separating data and control planes isn't unique to solutions labeled specifically SDN. ADN is a good example of this (you saw that coming, didn't you?).

A network component can – and this may surprise some people – internally decouple control and data planes. Yeah, I know, right? And doing so enables a platform that looks a whole lot like SDN diagrams, doesn't it – with plug-ins and programmability. This occurs in full-proxy architectures where there are dual stacks – one on the client side, one on the server side. Where traffic transitions from one stack to the other, there exists an opportunity to inspect, manipulate, and modify the traffic. Because the architecture requires acting as an endpoint to clients (and conversely as the point of origin for the server side), protocols can even be implemented in this "no man's land" between client and server. That enables protocol transitioning, such as enabling SPDY on the outside while still speaking HTTP on the inside, or IPv4 to servers while supporting IPv6 on the client (and vice versa).
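As a purely illustrative sketch (not any vendor's actual implementation), a full proxy's dual stacks can be modeled as two relayed connections with a transformation hook at the transition point. The backend address and the header rewrite below are hypothetical:

```python
import asyncio

async def pipe(reader, writer, transform=None):
    # Relay bytes from one stack to the other; the transition point
    # between stacks is where inspection and modification happen.
    while data := await reader.read(4096):
        writer.write(transform(data) if transform else data)
        await writer.drain()
    writer.close()

def add_via_header(data: bytes) -> bytes:
    # Hypothetical rewrite applied in the "no man's land" between
    # the client-side and server-side stacks.
    return data.replace(b"\r\n\r\n", b"\r\nVia: full-proxy\r\n\r\n", 1)

async def handle_client(client_r, client_w):
    # The client-side stack terminates here; a separate server-side
    # connection is opened, so the proxy is a true endpoint to both.
    server_r, server_w = await asyncio.open_connection("10.0.0.5", 80)
    await asyncio.gather(
        pipe(client_r, server_w, transform=add_via_header),
        pipe(server_r, client_w),
    )
```

Because each side is its own stack, the transform hook could just as easily translate between protocols (SPDY outside, HTTP inside) rather than merely rewrite headers.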

Where the separation occurs is not necessarily as important as the fact that it exists – unless you're focused on concrete, SDN-labeled solutions as being the only solutions that can provide the flexibility that programmability offers.

Automation occurs by exposing the management plane through an API (or implementing a specific API, such as OpenFlow) such that operational tasks and configuration can be achieved through tools instead of time.
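As a hedged illustration of that idea, operational tasks can be driven through a management-plane API from a script rather than typed by hand; the endpoint, resource names, and payload shape below are invented for the example, not a real product's interface:

```python
import json
import urllib.request

# Hypothetical management-plane REST API base URL.
BASE = "https://mgmt.example.net/api/v1"

def pool_member_payload(address: str, port: int, ratio: int = 1) -> dict:
    # Build the configuration object once; tools apply it consistently
    # to every device, removing the manual-configuration error risk.
    return {"address": address, "port": port, "ratio": ratio}

def add_pool_member(pool: str, address: str, port: int) -> None:
    body = json.dumps(pool_member_payload(address, port)).encode()
    req = urllib.request.Request(
        f"{BASE}/pools/{pool}/members", data=body,
        headers={"Content-Type": "application/json"}, method="POST")
    urllib.request.urlopen(req)  # same change, scripted, repeatably
```

The point is not the specific API but that the management plane is reachable by tools, so a hundred identical changes cost the same as one.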

Between automation and programmability, you realize network control.

Now, this is not SDN, at least not in terms of protocol support and concrete architecture. But it is software-defined, and it is networking, so does it count?

I guess it depends. ADN has always approached layers 4-7 with an eye toward extensibility, programmability and control that enables agility in the network. We didn't call it SDN and I don't see the industry deciding to "SDN-wash" existing ADN solutions as SDN just because a new term came along and became the TLA du jour.

What I do see is that ADN, and specifically full-proxy-based ADCs (application delivery controllers), already offer the same benefits using the same concepts as SDN. Consider again the core characteristics of SDN:

1. Control and data planes are decoupled

2. Intelligence and state are logically centralized

3. Underlying network infrastructure is abstracted from applications

All of these characteristics are present in an ADN. The ability to leverage network-side scripting on the control-plane side of the equation enables extensibility, rapid innovation, and the ability to adapt to support new protocols, new applications, and new business requirements – all without involving the vendor. That is exactly one of the benefits cited for SDN solutions, and specifically for OpenFlow-enabled architectures.
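A rough sketch of what network-side scripting enables, reduced to plain Python (the pool names, rule shapes, and request representation are all hypothetical): operator-supplied rules select behavior at the control point, so new routing logic needs no vendor release:

```python
# Operator-authored rules evaluated at the control plane; adding a
# rule changes network behavior without a firmware or vendor update.
RULES = [
    (lambda req: req["path"].startswith("/api"), "api_pool"),
    (lambda req: req["headers"].get("upgrade") == "websocket", "ws_pool"),
]

def select_pool(req: dict, default: str = "web_pool") -> str:
    # First matching rule wins; unmatched traffic takes the default.
    for predicate, pool in RULES:
        if predicate(req):
            return pool
    return default
```

The same pattern supports new protocols or business policies: the rule list is data plus small functions, not a feature request on a 12-18 month vendor schedule.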

So the question really is, does it matter if a solution to the problem of "agility in the network" is a concrete or conceptual SDN solution if it ultimately solves the same set of problems?


More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
