|By Greg Ness||
|April 13, 2009 05:45 AM EDT||
Whether you’re a small business considering cloud services or an enterprise contemplating public or private cloud services, it pays to understand some of the technical challenges and players likely to have a significant impact on the availability, security and costs of those services. Cloud computing is a game changer, and it may also pay to know who could win or lose as IT services are decoupled from specialized hardware in specific locations.
Don’t let the endless list of companies proclaiming cloud leadership mislead you into thinking the world has already embraced cloud; there is a vast difference between using cloud services to deliver software as a service and delivering cloud IT services in a multi-tenant public environment. There is also a sizable gap between cloud announcements, cloud revenue and enterprise-ready cloud services.
Vendors who best address the gap between true cloud requirements and today’s whirlwind of proclamations will be tomorrow’s winners as computing processes and storage requirements shift from endpoints and custom hardware to networks and netbooks. Investors who understand the difference between proclamations and critical technologies will make better decisions. Networking pros who understand the ramifications of this shift will have more influence over their career development.
I’ve been in the networking industry for most of the last nine years, so my perspective is understandably network-centric. My list of critical technical challenges focuses on networking, because I think this area hasn’t been adequately discussed amid the haze of vendor cloud positioning exercises, and because I think networks will be more strategic to the cloud than they are to the LAN or WAN.
There are at least three network-centric technology challenges when it comes to cloud computing: 1) network automation and management; 2) capacity; and 3) security.
The Case for Network Automation
Virtualization set the stage for cloud computing by decoupling applications and operating systems from hardware. Some even suggest that virtualization software is itself an operating system. That decoupling, combined with VMotion, enables considerable savings in how servers are utilized. Racks of specialized servers kept on 24/7 in case they’re needed can be converted into smaller racks of more powerful blade servers, distributed around the world to exploit off-peak power and turned on only as they’re needed.
The larger the pool of blade servers that can be utilized as needed, the higher the energy savings. Check out the product efficiency calculator at the Cisco data center blog. Today’s network infrastructure (infrastructure 1.0) contains millions of specialized servers connected by complex, growing networks that waste huge amounts of resources, from electricity to the human capital required for changes, configuration and a host of mundane yet specialized tasks.
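The consolidation arithmetic behind those savings is simple enough to sketch. The server counts, utilization figures and wattages below are illustrative assumptions, not numbers from the Cisco calculator:

```python
import math

# Illustrative consolidation math: many lightly loaded dedicated servers
# replaced by a smaller pool of blades sized to the aggregate demand.
# All utilization and wattage figures are assumptions, not measurements.

def consolidated_blades(n_servers, avg_util, blade_capacity=0.70):
    """Blades needed to carry the pooled load at a target utilization."""
    total_load = n_servers * avg_util            # aggregate CPU demand
    return math.ceil(total_load / blade_capacity)

def annual_kwh(count, watts):
    """Yearly energy for machines kept on 24/7."""
    return count * watts * 24 * 365 / 1000

before = annual_kwh(200, 300)                    # 200 servers, ~15% busy
after = annual_kwh(consolidated_blades(200, 0.15), 450)
print(f"before: {before:,.0f} kWh/yr, after: {after:,.0f} kWh/yr")
```

Under these assumed figures, 200 idle-heavy servers collapse into a few dozen busier blades, and the energy line item shrinks accordingly; the larger the pool, the better the fit.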
The Increasingly Unbearable Human Capital Factor
These tasks engage ranks of network administrators who manually manage everything from spreadsheets of IP addresses (otherwise known as IP address management, or IPAM) to DNS/DHCP, RADIUS, NTP and TFTP. Call these core network services, or call them one of the last bastions of manual labor and expense in IT. Manual labor gets increasingly expensive (even on a per-IP-address basis) as networks grow, and outage risk increases with every new device and network added.
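Even a minimal programmatic allocator shows what replacing the IPAM spreadsheet buys: allocations become auditable function calls instead of manual edits. A sketch only; the class and hostnames are hypothetical, not any vendor’s API:

```python
import ipaddress

# Minimal sketch of programmatic IPAM: the "spreadsheet" becomes a
# lease table that the code itself keeps consistent.
class SimpleIPAM:
    def __init__(self, cidr):
        self.pool = ipaddress.ip_network(cidr)
        self.leases = {}                       # ip address -> hostname

    def allocate(self, hostname):
        for ip in self.pool.hosts():           # skips network/broadcast
            if ip not in self.leases:
                self.leases[ip] = hostname
                return str(ip)
        raise RuntimeError("address pool exhausted")

    def release(self, ip):
        self.leases.pop(ipaddress.ip_address(ip), None)

ipam = SimpleIPAM("10.0.0.0/29")
print(ipam.allocate("web-01"))   # 10.0.0.1
print(ipam.allocate("web-02"))   # 10.0.0.2
```

The point is not the twenty lines of code but the property they have and a spreadsheet lacks: no two devices can ever be handed the same address, no matter how fast the network changes.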
Committees form as networks grow, in an effort to avoid outages and exercise better control over the availability, security and scalability of the network, not to mention the performance of applications. Yet these committees add time and expense to every network change, increasing costs further in the name of reducing risk. This “necessary bureaucracy” (at least for manually managed networks) severely constrains an organization’s ability to embrace the flexibility and consolidation enabled by virtualization and cloud.
While network automation (the automation of core network services) can deliver sizable capital and operating expense savings, it also helps companies position themselves for the coming era of virtualization and cloud computing. Yes, it’s true: some of the most mundane, even boring tasks required to keep a network available will become even more strategic in the next big era of computing.
The Case for More Network Capacity
I’m on a panel on dynamic infrastructure (infrastructure 2.0) in late May at the Strategic News Service Future in Review conference, along with Richard Kagan from Infoblox.
You can watch Cisco’s Gourlay (via YouTube) talk about the sheer load, operating and cash requirements of a data center (before movement is even added) during a recent Infrastructure 2.0 event. About five minutes in he covers the new network requirements of virtualization and cloud services, and about eight minutes in, the load requirements.
The business case for this level of mobility is especially powerful for the larger enterprise and service provider, and I think it is this business case that will drive the next round of investment in network infrastructure. Cisco’s recent Unified Computing announcement, the recent IBM/Juniper announcements and the IBM/Sun discussions all point to the synergy between networks, applications, endpoints and virtualized services.
I’m still waiting for a networking vendor to announce its own branded OEM netbook, similar to how Cisco entered VoIP years ago with Cisco branded OEM phones.
The virtualization software running on a blade server is called a hypervisor. One of the most important network implications of the hypervisor is that the network now actually terminates inside the blade server. For those preoccupied with the blade server portion of Cisco’s recent announcement, this explains how strategic the hypervisor is to the network.
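The point that the network terminates inside the blade can be sketched with a toy model of the hypervisor’s software switch; the class, VM names and MAC addresses below are purely illustrative:

```python
# Toy model of a hypervisor's virtual switch: the last network hop is
# software inside the blade, not a physical top-of-rack port.
class VSwitch:
    def __init__(self):
        self.ports = {}                        # MAC -> attached VM name

    def plug(self, mac, vm):
        self.ports[mac] = vm                   # VM's vNIC attaches in software

    def forward(self, dst_mac):
        # Traffic between VMs on the same host never reaches a physical
        # wire; unknown destinations are sent north to the uplink.
        return self.ports.get(dst_mac, "uplink")

vswitch = VSwitch()
vswitch.plug("aa:aa", "vm-web")
vswitch.plug("bb:bb", "vm-db")
print(vswitch.forward("bb:bb"))   # vm-db: handled entirely inside the host
print(vswitch.forward("cc:cc"))   # uplink: out to the physical network
```

That in-host hop is exactly the traffic that a physical switch port, and the tools built around it, can no longer see, which is why the hypervisor matters so much to the network.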
The (Infrastructure 2.0) network will ultimately be built on meshes of ever more powerful blade servers connected by ever more powerful networks capable of ever more powerful load transport managed by new generations of specialized appliances delivering unprecedented levels of automation and management. Specialization will shift from the hardware in the core of the network (starting with blade servers) to the hardware automating and managing the network.
Strategic Specialization Driving Unprecedented Automation and Commoditization
The increasing levels of movement and load and the business case enabled by virtualization and cloud computing will make management and automation strategic to the cloud. That strategic payoff will justify and support specialization while commodity functions will increasingly shift to software on commoditized blades.
Those who miss the strategic payoff of network automation will learn a painful lesson: adding higher velocities of change to a manually administered network drives up expenses and erodes the business case for virtualization and cloud computing. Virtualization cannot thrive on a network run by checklists and committees. You can read a recent blog by Cisco's James Urquhart on the critical role that core network service automation plays in the evolution to Infrastructure 2.0.
As commoditization spreads through populations of servers, switches and routers, intelligence and automation will shift from spreadsheets and labor-intensive freeware to a new generation of specialized, powerful appliances designed to unleash the power of automation across ever larger and more geographically dispersed grids. Vendors that get designed in, perhaps through partnerships and/or preloaded software, will have strategic advantages over those still caught up in the monetization of complexity and control that played a key part in the growth of the network hardware appliance industry.
We saw the same effect in the application delivery space as load balancers were commoditized and intelligence and specialization were designed into new layer 4-7 application front ends. New application delivery demands forced new functionality into specialized network appliances and established a booming industry made up of the likes of F5 Networks, Cisco and others. New levels of load and mobility will likewise require more network capacity and more automation and management.
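The commodity core of load balancing really is tiny, which is why the value migrated into the specialized layer 4-7 features built around it. A round-robin sketch, purely for illustration:

```python
import itertools

# The commodity heart of a load balancer: rotate requests across a
# backend pool. Everything valuable (health checks, SSL offload, L7
# routing) is specialization layered around this loop.
class RoundRobin:
    def __init__(self, backends):
        self._cycle = itertools.cycle(backends)   # endless rotation

    def pick(self):
        return next(self._cycle)

lb = RoundRobin(["10.0.0.1", "10.0.0.2", "10.0.0.3"])
print([lb.pick() for _ in range(4)])  # fourth pick wraps to the first backend
```

Once that loop became a commodity, differentiation had nowhere to live but in the specialized intelligence around it, the same shift the paragraph above describes for the broader network.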
The Hazy Cloud Security Story
When virtualization entered the data center it indirectly drove a meme explosion around virtualization security. Those pushing virtualization into production were in effect colliding two worlds of IT unused to working together: dev/test (operations) and network security. Of all the virtualization players, VMware got this first, creating an ecosystem and making an acquisition that enabled the first serious security offering from a virtualization vendor.
Because virtualization is a critical enabler of cloud computing, enabling the dynamic movement of processing power from one location to another (the decoupling of application from hardware), virtualization security issues only get more complex in a cloud environment. For an entertaining deep dive, try Chris Hoff’s "The Frog Who Would Be King" PowerPoint deck, or his blog post on PCI compliance in the cloud.
In essence, the dynamic mobility of a cloud computing environment wreaks havoc on static network security infrastructure. The same old attacks suddenly gain new cloud attack vectors and ever larger hordes of available treasure. While Google and Amazon often deflect cloud security questions with blanket statements, their cloud efforts are clearly focused on businesses and consumers less concerned about security risks and compliance. Other cloud providers may take a similar approach.
That isn’t to say they haven’t solved critical security issues, just that they haven’t been very open in discussing them. For those of us all too aware of the virtualization security surprise and its impact on VLAN spaghetti (the anti-cloud), cloud security proclamations deliver only a hazy picture of an image that needs to be very clear to enterprise IT execs.
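One way to see why mobility breaks static security: a firewall rule keyed to an IP address is left behind when VMotion moves the workload, while policy keyed to the workload’s identity travels with it. A hedged sketch, with invented rule names and addresses:

```python
# Sketch: static rules are tied to where a workload sits; identity-keyed
# policy is tied to what the workload is. When the VM migrates and picks
# up a new address, only the second kind still applies.
static_rules = {"10.1.1.5": "allow-db"}      # keyed to a location
identity_rules = {"vm-db-01": "allow-db"}    # keyed to the workload

def check(rules, key):
    """Default-deny lookup against a rule table."""
    return rules.get(key, "deny")

# The VM migrates: new IP address, same identity.
print(check(static_rules, "10.2.2.9"))       # deny: the rule stayed behind
print(check(identity_rules, "vm-db-01"))     # allow-db: policy traveled
```

This is the crux of the haze: until security state moves as fluidly as the workloads it protects, every migration is a potential policy gap.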
The Triple Play
Looking forward, I think the three dynamics of network automation, capacity and security will create new opportunities for vendors and network pros who understand the strategic shift enabled by cloud and the technological barriers in its way. Increasingly, it appears that IT services will evolve, force new partnerships, and shift specialization into new areas of IT that enable greater automation and mobility. That, in turn, will enable new security and capacity capabilities.
As Cisco, Microsoft, VMware, Juniper, IBM and Sun place their bets in various forms of partnership or collaboration it seems clear that whoever offers the most dynamic infrastructure with the most effective security and greatest capacity will have a strategic advantage selling to large enterprises and service providers. That advantage could put incredible pressures on those who have yet to articulate and deliver on the new vision.
The winners’ main competitors may end up being Google and Amazon rather than the usual assortment of category competitors, as those categories may become extinct.