Infrastructure 2.0: Squishy Name for a Squishy Concept

It remains, as James Urquhart put it recently, a “squishy term”

There’s been increasing interest in Infrastructure 2.0 of late, which is encouraging to those of us who’ve been, well, pushing it uphill against the focus on cloud computing and virtualization for quite some time now.

What’s been most frustrating about raising awareness of this concept is that cloud computing is one of the most tangible examples of both what Infrastructure 2.0 is and what it can do, and virtualization is certainly one of the larger technological drivers of Infrastructure 2.0-capable solutions today. So despite the frustration of watching cloud computing and virtualization steal the stage, as it were, the spotlight is certainly helping to bring the issues Infrastructure 2.0 is attempting to address to the fore. As it gains traction, one of the first challenges that must be addressed is to define what it is we mean when we say “Infrastructure 2.0.”

Like Web 2.0 – go ahead and try to define it simply – Infrastructure 2.0 remains, as James Urquhart put it recently, a “squishy term.”

James Urquhart in “Understanding Infrastructure 2.0”:

Right now, Infrastructure 2.0 is one of those "squishy" terms that can potentially incorporate a lot of different network automation characteristics. As is hinted at in the introduction to Ness' interview, there is a working group of network luminaries trying to sort out the details and propose an architectural framework, but we are still very early in the game. [link to referenced interview added]

What complicates Infrastructure 2.0 is that not only is the term “squishy” but so is the very concept. After all, Infrastructure 2.0 is mostly about collaboration, about integration, about intelligence. These are not off-the-shelf “solutions” but rather enabling technologies designed to drive the flexibility and agility of enterprise networks forward in such a way as to alleviate the pain points associated with the brittle, fragile network architectures of the past.

Greg Ness summed up the concept, at least, very well more than a year ago in “The beginning of the end of static infrastructure” when he said, “The issue comes down to static infrastructure incapable of keeping up with all of the new IP addresses and devices and initiatives and movement/change already taking place in large enterprises” and then noted that “the notion of application, endpoint and network intelligence thus far has been hamstrung by the lack of dynamic connectivity, or connectivity intelligence.”
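To make “dynamic connectivity” a little less abstract, here is a minimal sketch in Python of what it might look like when endpoint changes propagate automatically instead of waiting on manual edits. Everything in it – the registry class, the record names, the logging hook – is hypothetical and invented purely for illustration; a real deployment would be driving a DNS, IPAM, or load balancer API rather than an in-memory dictionary.

```python
# Illustrative sketch only: a toy "dynamic connectivity" registry.
# In a real environment these calls would hit DNS/IPAM/load-balancer
# APIs; here an in-memory dict stands in for that infrastructure.

from datetime import datetime, timezone

class ConnectivityRegistry:
    """Tracks hostname -> IP bindings and notifies interested components."""

    def __init__(self):
        self._records = {}      # hostname -> current IP
        self._listeners = []    # callbacks invoked on every change

    def subscribe(self, callback):
        """Let another component (firewall, LB, monitor) react to changes."""
        self._listeners.append(callback)

    def register(self, hostname, ip):
        """Called when an endpoint boots or moves; no manual edits required."""
        old_ip = self._records.get(hostname)
        self._records[hostname] = ip
        for notify in self._listeners:
            notify(hostname, old_ip, ip)

def log_change(hostname, old_ip, new_ip):
    # Stand-in for a real consumer such as a DNS or IPAM update call.
    stamp = datetime.now(timezone.utc).isoformat()
    print(f"[{stamp}] {hostname}: {old_ip} -> {new_ip}")

registry = ConnectivityRegistry()
registry.subscribe(log_change)

# A virtual machine spins up, then later migrates to another segment;
# the infrastructure keeps up by itself instead of drifting stale.
registry.register("app01.example.com", "10.0.0.15")
registry.register("app01.example.com", "10.0.3.42")
```

The point of the sketch is the subscription hook: any component that cares about connectivity can react to a change the moment it happens, which is precisely what static, manually edited infrastructure cannot do.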

What Greg noticed is missing is context, and perhaps even more importantly the ability to share that context across the entire infrastructure. I could, and have, gone on and on and on about this subject, so for now I’ll just stop and offer up a few links to some of the insightful posts that shed more light on Infrastructure 2.0 to date – its drivers, its requirements, its breadth of applicability, and its goals.
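As a rough illustration of what sharing context across the entire infrastructure could look like, the Python sketch below hands a single structured context record to several consumers, each of which acts on the slice it cares about. The field names and the components – DNS, load balancer, firewall – are assumptions made up for this example; the only point being made is that one change event can carry enough shared context for every layer to react on its own.

```python
# Illustrative only: one shared context record, many infrastructure consumers.
from dataclasses import dataclass

@dataclass
class EndpointContext:
    """One shared description of an endpoint that every layer can read."""
    hostname: str
    ip: str
    application: str
    environment: str   # e.g. "production" or "staging"
    healthy: bool

def update_dns(ctx: EndpointContext):
    print(f"DNS: point {ctx.hostname} at {ctx.ip}")

def update_load_balancer(ctx: EndpointContext):
    action = "enable" if ctx.healthy else "drain"
    print(f"LB:  {action} {ctx.ip} in pool '{ctx.application}'")

def update_firewall(ctx: EndpointContext):
    zone = "strict" if ctx.environment == "production" else "lab"
    print(f"FW:  apply '{zone}' policy to {ctx.ip}")

# Each consumer interprets the same context from its own vantage point.
CONSUMERS = [update_dns, update_load_balancer, update_firewall]

def broadcast(ctx: EndpointContext):
    for consumer in CONSUMERS:
        consumer(ctx)

broadcast(EndpointContext(
    hostname="app01.example.com",
    ip="10.0.3.42",
    application="storefront",
    environment="production",
    healthy=True,
))
```

Contrast that with today’s brittle architectures, where each of those layers is updated by hand, on its own schedule, from its own partial view of the network.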

James believes "Infrastructure 2.0" will “evolve into a body of standards that will have the same impact as BGP or DNS” and I share that belief. The trick is going to be developing standards that allow for the “squishiness” required to remain flexible and adaptable across myriad architectures and environments while still standardizing how that happens.


More Stories By Lori MacVittie

Lori MacVittie is responsible for education and evangelism of application services available across F5’s entire product suite. Her role includes authorship of technical materials and participation in a number of community-based forums and industry standards organizations, among other efforts. MacVittie has extensive programming experience as an application architect, as well as network and systems development and administration expertise. Prior to joining F5, MacVittie was an award-winning Senior Technology Editor at Network Computing Magazine, where she conducted product research and evaluation focused on integration with application and network architectures, and authored articles on a variety of topics aimed at IT professionals. Her most recent area of focus included SOA-related products and architectures. She holds a B.S. in Information and Computing Science from the University of Wisconsin at Green Bay, and an M.S. in Computer Science from Nova Southeastern University.
