The Transition to Cloud Computing: What Does It Mean For You?

Availability is important for cloud services, but so is security


We are standing on the threshold of a new transition in information technology and communications; a radical departure from current practice that promises to bring us new levels of efficiency at a vastly reduced cost. Cloud computing is full of potential, bursting with opportunity and within our grasp.

But, remember, that clouds always appear to be within our grasp and bursting clouds promise only one thing: rain!

As with all radical transitions, it takes time for the various pieces to fall into place. Some of them are already in place; some of them have yet to be considered. In this article, we will take a look at both and try to gauge where we are today and what work still remains. In addition, we will try to understand what this means to the various stakeholders involved.

Cloud composition
So what is the cloud, and who are the stakeholders involved? There are many definitions available, but in simple terms, cloud computing involves providing an information technology service that is accessed remotely. This access can be over a public or private infrastructure, but for our purposes, it is useful to consider the Internet as the reference delivery infrastructure.

With this in mind, a simple cloud model would include the following stakeholders:

  • The cloud service provider
  • The cloud connectivity provider
  • The Internet
  • The user connectivity provider
  • The user

The cloud service provider is based in a data center (which we assume he controls, for simplicity), where he has a number of servers running the cloud service being provided (e.g. a CRM system, a remote mail system, a remote file repository). He is responsible for ensuring that the servers are up and running, available at all times, and numerous enough to serve all the users who have subscribed to the service.

The cloud connectivity provider delivers Internet access connections to the cloud service provider and ensures that the cloud service provider has enough bandwidth for all of the users who wish to access the cloud service simultaneously. He must also ensure that these connections and the bandwidth requested are always available.

The user accesses the service remotely, typically through a web browser over the Internet. He also needs Internet access, which is provided by a connectivity provider (e.g. an ISP), but only enough to ensure that he can access the service quickly and without undue delay. The connectivity provider ensures that this connection and the required bandwidth are always available.

That leaves us with the Internet. Who is responsible for this? The connectivity providers will typically have control over their own parts of the network, but they must rely on other connectivity providers to bridge the gap between them. The beauty of the Internet is that they do not have to know about all the actors in the delivery chain. As long as they have a gateway to the Internet and the destination IP address, the packets can be directed to the right user and vice versa.

The Internet itself is made up of a number of interconnected networks, often run by telecom service providers that have implemented IP networks and can provide connectivity across the geographical regions where they are licensed to operate.

This brings the Internet and the cloud within the grasp of virtually everyone.

Cloud considerations
For cloud services to work, there are four fundamental requirements that need to be met:

  • There must be an open, standard access mechanism to the remote service that can allow access from anywhere to anyone who is interested in the service
  • This access must have enough bandwidth to ensure quality of experience (i.e. it should feel like the service or application is running on your desktop)
  • This access must be secure so that sensitive data is protected
  • This access must be available at ALL times

Some of these fundamentals are in place and are driving adoption of cloud services. The Internet and IP networking have grown to the point where they provide the perfect access mechanism. The Internet is a global network, accessible from virtually anywhere, as connectivity is now nearly ubiquitous. Bandwidth is not an issue either - it is only a question of how much you are willing to pay for your connectivity.

For users in particular, even a modestly priced Internet connection provides all the bandwidth they need to access the cloud services they require.

So far so good!

Cloud service providers are extremely conscious of the fact that availability and security are key requirements and generally ensure that there are redundant servers, failover mechanisms and other solutions to ensure high availability. They also provide trusted security mechanisms to ensure that only the right people get access to sensitive data.

Still on track then!

That leaves the connectivity providers and the Internet itself. This is where more effort is needed.

Cloud compromised
IP networks and the Internet were designed for efficient transfer of data. The idea is that instead of establishing permanent connections, as in telephone calls, where data follows a pre-determined route every time, data is routed on a packet-by-packet basis along the best route available at the time, as determined by the network itself. There are a number of routes to the same destination, so even if one fails, others will work. What you cannot guarantee is when data packets will reach the destination or in what order they will arrive. If packets don't arrive as expected, they are simply resent.

This works beautifully for data like web browsing, emails or file transfers, as it doesn't really matter when the data arrives as long as it gets there eventually.
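The "arrive in any order, reassemble at the receiver" model described above can be illustrated with a toy reorder buffer: packets carry sequence numbers, and the receiver delivers payloads in sequence order regardless of arrival order. This is a minimal sketch with illustrative names, not a real protocol stack:

```python
import heapq

def deliver_in_order(received):
    """Reassemble packets that arrived out of order.

    `received` is a list of (sequence_number, payload) tuples in
    arrival order; the function returns the payloads sorted by
    sequence number, mimicking a receiver's reorder buffer.
    """
    buffer = []
    for seq, payload in received:
        heapq.heappush(buffer, (seq, payload))
    ordered = []
    while buffer:
        ordered.append(heapq.heappop(buffer)[1])
    return ordered

# Packets took different routes and arrived out of order:
arrivals = [(2, "world"), (1, "hello"), (3, "!")]
print(deliver_in_order(arrivals))  # ['hello', 'world', '!']
```

A real transport layer adds timeouts and retransmission requests for missing sequence numbers; the sketch shows only the reordering step.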

But now IP networks and the Internet are being used for all sorts of services: Voice-over-IP, Video-over-IP, storage networks and so on. For many of these services, timing is critical and guaranteed bandwidth is required. Many of them also share the same connections as ordinary data services, so there also have to be mechanisms to ensure that they are prioritized relative to data services like those mentioned earlier.

The issue with this for cloud computing is that there is no mechanism for ensuring that a cloud service transported over the Internet will be given priority. In fact, it won't!
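At the network-element level, the prioritization mechanisms mentioned above come down to scheduling: traffic is tagged with a class, and higher-priority classes are served first. A toy strict-priority scheduler makes the idea concrete (the class numbers and traffic names are assumptions for illustration, not any specific QoS standard):

```python
import heapq

class PriorityScheduler:
    """Toy strict-priority packet scheduler: lower class number is
    served first, so real-time traffic (class 0) always preempts
    best-effort data (class 1)."""
    def __init__(self):
        self._queue = []
        self._count = 0  # tie-breaker preserves FIFO order within a class

    def enqueue(self, traffic_class, packet):
        heapq.heappush(self._queue, (traffic_class, self._count, packet))
        self._count += 1

    def dequeue(self):
        return heapq.heappop(self._queue)[2]

sched = PriorityScheduler()
sched.enqueue(1, "email")
sched.enqueue(0, "voip-frame")
sched.enqueue(1, "web-page")
print(sched.dequeue())  # voip-frame is served first despite arriving second
```

Within a managed network an operator can enforce such a policy; across the public Internet, as the article notes, no one enforces it end to end.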

Cloud computing is in its nascent stages, but as the popularity of this approach grows, more and more people will access applications and services remotely, leading to increased Internet traffic and congestion as multitudes of users converge on a few critical cloud service provider access points.

Up to now, this has not been an issue since there has been a healthy investment in networking capacity and congestion has been solved by "throwing bandwidth at the problem". But in these fiscally challenging times, this is no longer an option. Making more efficient use of the existing infrastructure is the order of the day.

Cloud certainty
To have confidence that cloud services are available at all times, it is not enough to wait for issues to occur and rely on fallback solutions. The utilization and performance of critical links must be monitored proactively to assure cloud service availability.

This requires dedicated network performance monitoring appliances at all points in the delivery chain. These appliances are stand-alone hardware and software systems that are capable of capturing and analyzing all the data traffic on a link in real-time, at speeds up to 10 Gbps. Each data packet is analyzed to understand where it has come from, where it is going and the application that produced it.

With this information in hand, it is possible to see the utilization of critical links, as well as the applications and users that are hogging the bandwidth.
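The per-packet analysis described above is typically aggregated into per-flow statistics, from which link utilization and top bandwidth consumers fall out directly. A minimal sketch, assuming simplified flow records of (source, application, bytes) for one measurement interval - the field names are illustrative, not any product's schema:

```python
from collections import Counter

LINK_CAPACITY_BPS = 10_000_000_000  # a 10 Gbps link, as in the article

def summarize(flow_records, interval_s):
    """Compute link utilization and the top applications by volume.

    `flow_records` is a list of (source, application, bytes) tuples
    observed on one link during one interval of `interval_s` seconds.
    """
    total_bits = 8 * sum(nbytes for _, _, nbytes in flow_records)
    utilization = total_bits / (LINK_CAPACITY_BPS * interval_s)
    by_app = Counter()
    for _, app, nbytes in flow_records:
        by_app[app] += nbytes
    return utilization, by_app.most_common(3)

flows = [("10.0.0.5", "crm",    4_000_000_000),
         ("10.0.0.9", "backup", 9_000_000_000),
         ("10.0.0.7", "voip",      60_000_000)]
util, top = summarize(flows, interval_s=60)
print(f"link utilization: {util:.1%}")   # ~17.4% of a 10 Gbps link
print("top applications:", top)          # backup is hogging the bandwidth
```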

For cloud service providers, these network appliances can be used to monitor their communication with the outside world; they also give providers grounds to demand visibility into their connectivity providers' networks to understand how their traffic is being transported.

The connectivity provider can use these solutions to support the Service Level Agreements (SLAs) they have with cloud service providers and ensure that there is available bandwidth. They can equally demand the same level of SLA from their other connectivity providers in the Internet domain. Thus the chain continues.
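Supporting an SLA with proactive monitoring ultimately means scoring the measurements against an agreed threshold. A simple sketch of that scoring step, assuming one latency probe per interval and an assumed 50 ms budget (the figure is hypothetical, not from the article):

```python
def availability(samples, latency_budget_ms=50):
    """Fraction of monitoring intervals in which the link both
    responded and stayed within the latency budget - one simple way
    to score an SLA from proactive measurements.

    `samples` holds one round-trip latency (ms) per interval;
    None marks a lost probe.
    """
    ok = sum(1 for latency in samples
             if latency is not None and latency <= latency_budget_ms)
    return ok / len(samples)

# One probe per minute for an hour; one probe lost, one over budget.
probes = [12] * 57 + [None, 80, 14]
print(f"measured availability: {availability(probes):.2%}")  # 96.67%
```

In practice an SLA report would break this down per link and per traffic class, but the principle - continuous measurement scored against agreed thresholds - is the same along the whole chain.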

Such network performance tools are available today and are being deployed in many enterprise, data center and communication networks. However, they need to be regarded as an essential part of the cloud service delivery infrastructure.

Cloud confidence
Availability is important for cloud services, but so is security. Cloud service providers provide a number of mechanisms to ensure that only the right persons gain access to critical data.

However, this is not the only threat. Malware, denial-of-service attacks and other malicious activity are becoming more prevalent. This requires dedicated network security solutions, such as firewalls and intrusion prevention systems, that can provide a fence around critical access points. These are deployed primarily at the enterprise and in the data center, where the users and cloud service providers reside, but can also sit in the connectivity provider's network, securing critical links.

Again, these network security solutions are stand-alone hardware and software systems that are capable of analyzing high-speed data in real-time, taking action and then sending clean data traffic on its way. The process is completely transparent to the user and cloud service provider.
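The transparent "inspect, drop the bad, forward the clean" behavior described above can be reduced to a few lines. In this sketch, `is_malicious` stands in for an intrusion prevention system's detection engine, which in reality is the hard part; the packet strings are purely illustrative:

```python
def inline_filter(packets, is_malicious):
    """Toy inline security filter: inspect each packet, drop anything
    flagged as malicious, and forward the clean traffic unchanged."""
    clean, dropped = [], 0
    for pkt in packets:
        if is_malicious(pkt):
            dropped += 1
        else:
            clean.append(pkt)
    return clean, dropped

traffic = ["GET /index.html", "EXPLOIT\x90\x90", "POST /login"]
clean, dropped = inline_filter(traffic, lambda p: "\x90\x90" in p)
print(f"forwarded {len(clean)} packets, dropped {dropped}")
```

Because only offending packets are removed, legitimate users and the cloud service provider see no change in behavior - which is what makes the process transparent.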

These systems ensure that the doors are firmly closed to would-be intruders, and they should be mandatory at all critical access points in the cloud service delivery chain.

Cloud clarity
Many pieces of the cloud service delivery chain are in place. What remains are the key components to assure service performance, availability and network security.

Network appliance solutions exist to address these areas, and thanks to advanced network adapters capable of handling data traffic at up to 10 Gbps in real-time without losing packets, they now have the performance to keep up with even the highest-speed networks. What remains is to make these network appliances a mandatory component of the cloud service delivery infrastructure, underpinning clear SLAs that can assure performance and security across the delivery chain.

So don't let the cloud rain on your parade! Ensure that all the pieces are in place and enjoy the benefits that the cloud can provide and the new opportunities it will enable.

More Stories By Dan Joe Barry

Dan Joe Barry is VP of Marketing at Napatech. Napatech develops and markets the world's most advanced programmable network adapters for network traffic analysis and application off-loading. Napatech is the leading OEM supplier of Ethernet network acceleration adapter hardware with an installed base of more than 140,000 ports.

