
The Transition to Cloud Computing: What Does It Mean For You?

Availability is important for cloud services, but so is security


We are standing on the threshold of a new transition in information technology and communications; a radical departure from current practice that promises to bring us new levels of efficiency at a vastly reduced cost. Cloud computing is full of potential, bursting with opportunity and within our grasp.

But remember: clouds always appear to be within our grasp, and bursting clouds promise only one thing: rain!

As with all radical transitions, it takes time for the various pieces to fall into place. Some of them are already in place; some of them have yet to be considered. In this article, we will take a look at both and try to gauge where we are today and what work still remains. In addition, we will try to understand what this means to the various stakeholders involved.

Cloud composition
So what is the cloud, and who are the stakeholders involved? There are many definitions available, but in simple terms, cloud computing involves providing an information technology service that is accessed remotely. This access can be over a public or private infrastructure, but for our purposes, it is useful to consider the Internet as the reference delivery infrastructure.

With this in mind, a simple cloud model would include the following stakeholders:

  • The cloud service provider
  • The cloud connectivity provider
  • The Internet
  • The user connectivity provider
  • The user

The cloud service provider is based in a data center (which we assume he controls for simplicity), where he has a number of servers running the cloud service being provided (e.g. a CRM system, a remote mail system, remote file repository, etc.). He is responsible for ensuring that the servers are up and running, are available at all times and that there are enough of them to service all the users who have subscribed to the service.

The cloud connectivity provider delivers Internet access connections to the cloud service provider and ensures that the cloud service provider has enough bandwidth for all of the users who wish to access the cloud service simultaneously. He must also ensure that these connections and the bandwidth requested are always available.

The user accesses the service remotely, typically through a web browser over the Internet. He also needs Internet access, which is provided by a connectivity provider (e.g. an ISP), but only enough to ensure that he can access the service quickly and without too many delays. The connectivity provider ensures that his connection and the required bandwidth are always available.

Which leaves us with the Internet. Who is responsible for this? The connectivity providers will typically have control over their parts of the network, but they must rely on other connectivity service providers to bridge the gap between them. The beauty of the Internet is that they do not have to know about all the actors in the chain of delivery. As long as they have a gateway to the Internet and the destination IP address, then the packets can be directed to the right user and vice versa.

The Internet itself is made up of a number of interconnected networks, often telecom service provider networks, that provide IP connectivity across the geographical regions where they are licensed to offer services.

This brings the Internet and the cloud within the grasp of virtually everyone.

Cloud considerations
For cloud services to work, there are four fundamental requirements that need to be met:

  • There must be an open, standard access mechanism to the remote service that can allow access from anywhere to anyone who is interested in the service
  • This access must have enough bandwidth to ensure quality of experience (i.e. it should feel like the service or application is running on your desktop)
  • This access must be secure so that sensitive data is protected
  • This access must be available at ALL times
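The last requirement is the hardest to meet, because availability compounds across the whole delivery chain: the service is only reachable when every hop is up. The sketch below illustrates this with hypothetical per-hop availability figures (the numbers are for illustration only, not measurements):

```python
# Illustrative sketch: end-to-end availability of a cloud service is
# bounded by the product of per-hop availabilities in the delivery chain.
# All figures below are hypothetical.

hops = {
    "cloud service provider": 0.9999,     # redundant servers, failover
    "cloud connectivity provider": 0.999,
    "Internet transit": 0.999,
    "user connectivity provider": 0.995,
}

end_to_end = 1.0
for hop, availability in hops.items():
    end_to_end *= availability

# The chain is always less available than its weakest link.
print(f"End-to-end availability: {end_to_end:.4f}")
```

Even with each individual hop performing well, the end-to-end figure drops below the weakest single link, which is why every stakeholder in the chain matters.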

Some of these fundamentals are in place and are driving adoption of cloud services. The Internet and IP networking have grown to the point where they provide the perfect access mechanism. The Internet is a global network, accessible from anywhere, as Internet connectivity is now virtually ubiquitous. The bandwidth of the Internet is also not an issue - it is only a question of how much you are willing to pay for your connectivity.

For users in particular, a modestly priced Internet connection provides all the bandwidth they need to access the cloud services they require.

So far so good!

Cloud service providers are extremely conscious of the fact that availability and security are key requirements and generally ensure that there are redundant servers, failover mechanisms and other solutions to ensure high availability. They also provide trusted security mechanisms to ensure that only the right people get access to sensitive data.

Still on track then!

That leaves the connectivity providers and the Internet itself. This is where more effort is needed.

Cloud compromised
IP networks and the Internet were designed for the efficient transfer of data. The idea is that instead of establishing permanent connections, like telephone calls where data follows a pre-determined route every time, data is routed on a packet-by-packet basis over the best route available at the time, as determined by the network itself. There are a number of routes to the same destination, so even if one fails, others will work. What you can't guarantee is when data packets will arrive at the destination, or in what order they will arrive. If packets don't arrive as expected, they are simply resent.

This works beautifully for data like web browsing, emails or file transfers, as it doesn't really matter when the data arrives as long as it gets there eventually.
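The receiver's side of this trade-off can be sketched in a few lines: packets carry sequence numbers, the receiver buffers whatever arrives, delivers the contiguous prefix, and asks for resends of the gaps. This is a simplified illustration of the general mechanism (as used by protocols such as TCP), not any specific implementation; all names and payloads are made up:

```python
# Minimal sketch of reassembly under out-of-order, lossy delivery.
# Packets carry sequence numbers; the receiver buffers arrivals,
# delivers only the contiguous in-order prefix, and reports gaps
# so the sender can retransmit them.

def reassemble(received, total):
    """Return (deliverable in-order payload, sequence numbers to resend)."""
    buffer = {seq: payload for seq, payload in received}
    missing = [seq for seq in range(total) if seq not in buffer]
    # Only the contiguous prefix before the first gap can be delivered.
    limit = missing[0] if missing else total
    delivered = b"".join(buffer[seq] for seq in range(limit))
    return delivered, missing

# Packets 0..3 were sent; packet 2 was lost and packet 3 overtook packet 1.
arrived = [(0, b"AA"), (3, b"DD"), (1, b"BB")]
delivered, missing = reassemble(arrived, total=4)
print(delivered, missing)  # b'AABB' [2] -- packet 2 must be resent
```

The cost of this robustness is time: delivery of everything after a gap waits until the missing packet is retransmitted, which is exactly why elastic data tolerates it and real-time services do not.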

But now, IP networks and the Internet are being used for all sorts of services, such as Voice-over-IP, Video-over-IP and storage networks. For many of these services, time is critical and guaranteed bandwidth is required. Many of these services also share the same connections as normal data services, so there also have to be mechanisms to ensure that they are prioritized over data services like those mentioned earlier.

The issue with this for cloud computing is that there is no mechanism for ensuring that a cloud service transported over the Internet will be given priority. In fact, it won't!

Cloud computing is in its nascent stages, but as its popularity grows, more and more people will access applications and services remotely, leading to increased Internet traffic and congestion as multitudes of users converge on a few critical cloud service provider access points.

Up to now, this has not been an issue since there has been a healthy investment in networking capacity and congestion has been solved by "throwing bandwidth at the problem". But in these fiscally challenging times, this is no longer an option. Making more efficient use of the existing infrastructure is the order of the day.

Cloud certainty
To have confidence that cloud services are available at all times, it is not enough to wait for issues to occur and rely on fallback solutions. The utilization and performance of critical links must be monitored proactively to assure cloud service availability.

This requires dedicated network performance monitoring appliances at all points in the delivery chain. These appliances are stand-alone hardware and software systems that are capable of capturing and analyzing all the data traffic on a link in real-time, at speeds up to 10 Gbps. Each data packet is analyzed to understand where it has come from, where it is going and the application that produced it.

With this information in hand, it is possible to see the utilization of critical links, as well as the applications and users that are hogging the bandwidth.
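The accounting step described above can be sketched simply: classify each captured packet by application and total the bytes to see who is consuming the link. The packet records and capacity figure below are synthetic placeholders; a real appliance decodes this from the wire in real time:

```python
# Hedged sketch of per-link, per-application utilization accounting.
# Each captured packet record is (src_ip, dst_ip, application, bytes);
# all values here are hypothetical examples.

from collections import defaultdict

packets = [
    ("10.0.0.5", "203.0.113.9", "CRM", 1500),
    ("10.0.0.7", "203.0.113.9", "VoIP", 200),
    ("10.0.0.5", "203.0.113.9", "CRM", 1500),
    ("10.0.0.9", "198.51.100.4", "Backup", 9000),
]

bytes_per_app = defaultdict(int)
for src, dst, app, size in packets:
    bytes_per_app[app] += size

link_capacity = 12_500  # bytes available in this interval (illustrative)
for app, total in sorted(bytes_per_app.items(), key=lambda kv: -kv[1]):
    print(f"{app}: {total} B ({total / link_capacity:.0%} of link)")
```

Sorting by volume surfaces the bandwidth hogs immediately, which is the information an operator needs before congestion becomes a service outage.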

For cloud service providers, these network appliances can be used to monitor their communication with the outside world; they also allow providers to demand visibility into their connectivity providers' networks to understand how their traffic is being transported.

The connectivity provider can use these solutions to support the Service Level Agreements (SLAs) they have with cloud service providers and ensure that there is available bandwidth. They can equally demand the same level of SLA from their other connectivity providers in the Internet domain. Thus the chain continues.

Such network performance tools are available today and being deployed in many enterprise, data center and communication networks. However, they need to be regarded as an essential part of the cloud service delivery infrastructure.

Cloud confidence
Availability is important for cloud services, but so is security. Cloud service providers provide a number of mechanisms to ensure that only the right persons gain access to critical data.

However, this is not the only threat. Malware, denial of service attacks and other malicious activity are becoming more prevalent. This requires dedicated network security solutions, such as firewalls and intrusion prevention systems that can provide a fence around critical access points. These are primarily at the enterprise and data center where the users and cloud service providers reside, but can also be in the connectivity provider's network securing critical links.

Again, these network security solutions are stand-alone hardware and software systems that are capable of analyzing high-speed data in real-time, taking action and then sending clean data traffic on its way. The process is completely transparent to the user and cloud service provider.

Using these systems ensures that the doors are firmly closed to would-be intruders and should be mandatory at all critical access points in the cloud service delivery chain.

Cloud clarity
Many pieces of the cloud service delivery chain are in place. What remains are the key components to assure service performance, availability and network security.

Network appliance solutions exist to address these areas, and they now have the performance to keep up with even the highest-speed networks thanks to advanced network adapters capable of handling data traffic at up to 10 Gbps in real-time without losing packets. What remains is to make these network appliances a mandatory component in the cloud service delivery infrastructure, underpinning clear SLAs that can assure performance and security across the delivery chain.

So don't let the cloud rain on your parade! Ensure that all the pieces are in place and enjoy the benefits that the cloud can provide and the new opportunities it will enable.

More Stories By Dan Joe Barry

Dan Joe Barry is VP of Marketing at Napatech. Napatech develops and markets the world's most advanced programmable network adapters for network traffic analysis and application off-loading. Napatech is the leading OEM supplier of Ethernet network acceleration adapter hardware with an installed base of more than 140,000 ports.

