By Dan Joe Barry
December 10, 2009 04:45 AM EST
Cloud Computing on Ulitzer
We are standing on the threshold of a new transition in information technology and communications; a radical departure from current practice that promises to bring us new levels of efficiency at a vastly reduced cost. Cloud computing is full of potential, bursting with opportunity and within our grasp.
But remember that clouds always appear to be within our grasp, and bursting clouds promise only one thing: rain!
As with all radical transitions, it takes time for the various pieces to fall into place. Some of them are already in place; some of them have yet to be considered. In this article, we will take a look at both and try to gauge where we are today and what work still remains. In addition, we will try to understand what this means to the various stakeholders involved.
So what is the cloud, and who are the stakeholders involved? There are many definitions available, but in simple terms, cloud computing involves providing an information technology service that is accessed remotely. This access can be over a public or private infrastructure, but for our purposes, it is probably useful to consider the Internet as a reference delivery infrastructure.
With this in mind, a simple cloud model would include the following stakeholders:
- The cloud service provider
- The cloud connectivity provider
- The Internet
- The user connectivity provider
- The user
The cloud service provider is based in a data center (which we assume he controls for simplicity), where he has a number of servers running the cloud service being provided (e.g. a CRM system, a remote mail system, remote file repository, etc.). He is responsible for ensuring that the servers are up and running, are available at all times and that there are enough of them to service all the users who have subscribed to the service.
The cloud connectivity provider delivers Internet access connections to the cloud service provider and ensures that the cloud service provider has enough bandwidth for all of the users who wish to access the cloud service simultaneously. He must also ensure that these connections and the bandwidth requested are always available.
The user accesses the service remotely, typically through a web browser over the Internet. He also needs Internet access, which is provided by a connectivity provider (e.g. an ISP), but only enough to ensure that he can access the service quickly and without too many delays. The connectivity provider ensures that his connection and required bandwidth are always available.
Which leaves us with the Internet. Who is responsible for this? The connectivity providers will typically have control over their parts of the network, but they must rely on other connectivity service providers to bridge the gap between them. The beauty of the Internet is that they do not have to know about all the actors in the chain of delivery. As long as they have a gateway to the Internet and the destination IP address, then the packets can be directed to the right user and vice versa.
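To make this concrete, here is a minimal Python sketch of the user's side of that exchange: the client only needs to resolve the service's name to a destination IP address and hand packets to its gateway, and the intermediate networks choose the path hop by hop. The hostname used here is purely a placeholder, not a real service.

```python
import socket

# "crm.example-cloud.com" is a placeholder name for a cloud service.
host, port = "crm.example-cloud.com", 443

# Resolve the destination IP address via DNS.
ip_address = socket.gethostbyname(host)
print(f"Destination resolved to {ip_address}")

# Open a TCP connection; every router along the way forwards the packets
# toward this address without the client knowing the full route.
with socket.create_connection((ip_address, port), timeout=5) as conn:
    print(f"Connected to {conn.getpeername()}")
```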
The Internet itself is made up of a number of interconnected networks, often operated by telecom service providers who have implemented IP networks and can provide connectivity across the geographical regions where they are licensed to provide services.
This brings the Internet and the cloud within the grasp of virtually everyone.
For cloud services to work, there are four fundamental requirements that need to be met:
- There must be an open, standard access mechanism to the remote service that can allow access from anywhere to anyone who is interested in the service
- This access must have enough bandwidth to ensure quality of experience (i.e. it should feel like the service or application is running on your desktop)
- This access must be secure so that sensitive data is protected
- This access must be available at ALL times
Some of these fundamentals are in place and are driving adoption of cloud services. The Internet and IP networking have grown to a point where they provide the ideal access mechanism. The Internet is a global network, accessible from anywhere, as Internet connectivity is now virtually ubiquitous. The bandwidth of the Internet is also not an issue; it is only a question of how much you are willing to pay for your connectivity.
Nevertheless, for users in particular, a modestly priced Internet connection provides all the bandwidth they need to access the cloud services they require.
So far so good!
Cloud service providers are extremely conscious of the fact that availability and security are key requirements and generally ensure that there are redundant servers, failover mechanisms and other solutions to ensure high availability. They also provide trusted security mechanisms to ensure that only the right people get access to sensitive data.
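As an illustration of the kind of failover mechanism meant here, the following Python sketch shows a client that tries a list of redundant service endpoints in turn. The addresses are hypothetical, and a real deployment would typically hide this behind DNS or a load balancer.

```python
import socket

# Hypothetical redundant front-ends for the same cloud service.
REDUNDANT_ENDPOINTS = [("10.0.0.11", 443), ("10.0.0.12", 443), ("10.0.0.13", 443)]

def connect_with_failover(endpoints, timeout=2):
    """Try each redundant server in turn so a single failed server does not
    make the cloud service unavailable to the user."""
    for host, port in endpoints:
        try:
            return socket.create_connection((host, port), timeout=timeout)
        except OSError:
            continue  # server down or unreachable; fail over to the next one
    raise ConnectionError("all redundant servers are unavailable")
```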
Still on track then!
That leaves the connectivity providers and the Internet itself. This is where more effort is needed.
IP networks and the Internet were designed for efficient transfer of data. The idea is that instead of establishing permanent connections like telephone calls, where data follows a pre-determined route every time, the data is routed through the network on a packet-by-packet basis over the best route available at the time, as determined by the network itself. There are a number of routes to the same destination, so even if one doesn't work, others will. What you can't guarantee is when data packets will reach the destination or in what order they will arrive. If packets don't arrive as expected, they are simply resent.
This works beautifully for data like web browsing, emails or file transfers, as it doesn't really matter when the data arrives as long as it gets there eventually.
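A toy simulation makes the point: packets may be dropped or arrive out of order, and a reliable transport simply resends what is missing and reassembles the data at the receiver. The Python sketch below is a simplified model of that behavior, not a description of how any particular network element works.

```python
import random

def send_over_best_effort_network(packets, loss_rate=0.1):
    """Toy model of best-effort IP delivery: each packet independently takes
    its own path, may be dropped, and may arrive out of order. Lost packets
    are simply resent, as a reliable transport such as TCP would do."""
    delivered = {}
    outstanding = list(packets)            # sequence numbers still to deliver
    send_rounds = 0
    while outstanding:
        send_rounds += 1
        random.shuffle(outstanding)        # packets arrive in no fixed order
        still_missing = []
        for seq in outstanding:
            if random.random() < loss_rate:
                still_missing.append(seq)  # dropped somewhere; resend later
            else:
                delivered[seq] = f"payload-{seq}"
        outstanding = still_missing
    # The receiver reassembles the data in sequence-number order.
    return [delivered[seq] for seq in sorted(delivered)], send_rounds

data, rounds = send_over_best_effort_network(range(10))
print(f"All 10 packets delivered in order after {rounds} send round(s)")
```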
But now, IP networks and the Internet are being used for all sorts of services like Voice-over-IP, Video-over-IP, storage networking, etc. For many of these services, time is critical and a guaranteed bandwidth is required. Many of these services also share the same connections as normal data services, so there also have to be mechanisms to ensure that they are prioritized relative to data services like those mentioned earlier.
The issue for cloud computing is that there is no mechanism to ensure that a cloud service transported over the Internet will be given priority. In fact, it won't!
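Within a single managed network, traffic can at least be marked for preferential treatment, for example with DiffServ code points set through a standard socket option, as in the Python sketch below. The destination address is a documentation placeholder, and across the public Internet such markings are typically ignored or stripped, which is exactly the gap described here.

```python
import socket

# DSCP "Expedited Forwarding" (46) shifted into the upper six bits of the
# IP ToS byte; routers that honour DiffServ can prioritise this traffic.
DSCP_EF = 46 << 2

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, DSCP_EF)

# 192.0.2.10 is a documentation address standing in for a VoIP gateway.
# Inside a provider network with DiffServ policies this datagram would be
# queued ahead of ordinary data traffic; on the public Internet it would not.
sock.sendto(b"rtp-sample", ("192.0.2.10", 5004))
sock.close()
```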
Cloud computing is in its nascent stages, but as the popularity of this approach grows, more and more people will access applications and services remotely, leading to increased Internet traffic and congestion as multitudes of users converge on a few critical cloud service provider points.
Up to now, this has not been an issue since there has been a healthy investment in networking capacity and congestion has been solved by "throwing bandwidth at the problem". But in these fiscally challenging times, this is no longer an option. Making more efficient use of the existing infrastructure is the order of the day.
To have confidence that cloud services are available at all times, it is not enough to wait for issues to occur and rely on fallback solutions. The utilization and performance of critical links must be monitored proactively to assure cloud service availability.
This requires dedicated network performance monitoring appliances at all points in the delivery chain. These appliances are stand-alone hardware and software systems that are capable of capturing and analyzing all the data traffic on a link in real-time, at speeds up to 10 Gbps. Each data packet is analyzed to understand where it has come from, where it is going and the application that produced it.
With this information in hand, it is possible to see the utilization of critical links, as well as the applications and users that are hogging the bandwidth.
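The following Python sketch illustrates the principle at small scale, using the scapy packet library to tally bytes per flow from a capture file; a dedicated appliance performs this kind of classification continuously on a live multi-gigabit link. The capture file name is a placeholder.

```python
from collections import Counter
from scapy.all import rdpcap, IP, TCP, UDP   # requires the scapy package

def top_talkers(pcap_file, n=5):
    """Tally bytes per (source, destination, destination-port) flow in a
    capture file - a small-scale version of what a monitoring appliance
    does in real time on a high-speed link."""
    byte_counts = Counter()
    for pkt in rdpcap(pcap_file):
        if IP not in pkt:
            continue
        ip = pkt[IP]
        dport = pkt[TCP].dport if TCP in pkt else (pkt[UDP].dport if UDP in pkt else 0)
        byte_counts[(ip.src, ip.dst, dport)] += len(pkt)
    return byte_counts.most_common(n)

# "link_capture.pcap" is a placeholder name for a capture of the link.
for (src, dst, dport), nbytes in top_talkers("link_capture.pcap"):
    print(f"{src} -> {dst}:{dport}  {nbytes} bytes")
```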
For cloud service providers, these network appliances can be used to monitor their communication with the outside world, but they also allow them to demand visibility into their connectivity providers' networks to understand how their traffic is being transported.
The connectivity provider can use these solutions to support the Service Level Agreements (SLAs) they have with cloud service providers and to ensure that the promised bandwidth is available. They can equally demand the same level of SLA from the other connectivity providers in the Internet domain. Thus the chain continues.
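A hypothetical sketch of how such measurements might be checked against SLA targets is shown below; the thresholds (80% utilization, 99.9% availability) are illustrative and not taken from any particular agreement.

```python
def check_sla(samples, max_utilization=0.8, min_availability=0.999):
    """Compare per-interval measurements from a monitored link against
    hypothetical SLA targets: utilization must stay below 80% of capacity
    and the link must be reachable 99.9% of the time."""
    breaches = [s for s in samples if s["utilization"] > max_utilization]
    availability = sum(1 for s in samples if s["reachable"]) / len(samples)
    return {
        "utilization_breaches": len(breaches),
        "availability": availability,
        "sla_met": not breaches and availability >= min_availability,
    }

# Example: three 5-minute measurement intervals reported by a probe.
samples = [
    {"utilization": 0.52, "reachable": True},
    {"utilization": 0.91, "reachable": True},   # congestion spike
    {"utilization": 0.47, "reachable": True},
]
print(check_sla(samples))
```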
Such network performance tools are available today and are being deployed in many enterprise, data center and communication networks. However, they need to be regarded as an essential part of the cloud service delivery infrastructure.
Availability is important for cloud services, but so is security. Cloud service providers offer a number of mechanisms to ensure that only the right people gain access to critical data.
However, this is not the only threat. Malware, denial-of-service attacks and other malicious activity are becoming more prevalent. This requires dedicated network security solutions, such as firewalls and intrusion prevention systems, that can provide a fence around critical access points. These are deployed primarily at the enterprise and the data center, where the users and cloud service providers reside, but can also sit in the connectivity provider's network, securing critical links.
Again, these network security solutions are stand-alone hardware and software systems that are capable of analyzing high-speed data in real-time, taking action and then sending clean data traffic on its way. The process is completely transparent to the user and cloud service provider.
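As a highly simplified illustration of that inline model, the Python sketch below checks each payload against a list of placeholder signatures, drops matches and forwards everything else unchanged; real intrusion prevention systems use far richer detection logic than simple byte-pattern matching.

```python
# Hypothetical byte patterns standing in for real intrusion signatures.
SIGNATURES = [b"cmd.exe", b"<script>alert", b"' OR 1=1"]

def inspect_and_forward(packet_payload, forward):
    """Toy model of an inline intrusion prevention system: every payload is
    checked against known signatures; matches are dropped, everything else
    is forwarded untouched, so the user never notices the inspection."""
    if any(sig in packet_payload for sig in SIGNATURES):
        return "dropped"
    forward(packet_payload)
    return "forwarded"

clean = []
print(inspect_and_forward(b"GET /index.html HTTP/1.1", clean.append))   # forwarded
print(inspect_and_forward(b"GET /?q=' OR 1=1 HTTP/1.1", clean.append))  # dropped
```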
Using these systems ensures that the doors are firmly closed to would-be intruders and should be mandatory at all critical access points in the cloud service delivery chain.
Many pieces of the cloud service delivery chain are in place. What is still missing are the key components needed to assure service performance, availability and network security.
Network appliance solutions exist to address these areas, and they now have the performance to keep up with even the highest-speed networks, thanks to advanced network adapters capable of handling data traffic at up to 10 Gbps in real time without losing packets. What remains is to make these network appliances a mandatory component of the cloud service delivery infrastructure, underpinning clear SLAs that can assure performance and security across the delivery chain.
So don't let the cloud rain on your parade! Ensure that all the pieces are in place and enjoy the benefits that the cloud can provide and the new opportunities it will enable.