The Transition to Cloud Computing: What Does It Mean For You?

Availability is important for cloud services, but so is security


We are standing on the threshold of a new transition in information technology and communications; a radical departure from current practice that promises to bring us new levels of efficiency at a vastly reduced cost. Cloud computing is full of potential, bursting with opportunity and within our grasp.

But remember: clouds always appear to be within our grasp, and bursting clouds promise only one thing: rain!

As with all radical transitions, it takes time for the various pieces to fall into place. Some of them are already in place; some of them have yet to be considered. In this article, we will take a look at both and try to gauge where we are today and what work still remains. In addition, we will try to understand what this means to the various stakeholders involved.

Cloud composition
So what is the cloud, and who are the stakeholders involved? There are many definitions available, but in simple terms, cloud computing involves providing an information technology service that is accessed remotely. This access can be over a public or private infrastructure, but for our purposes, it is useful to consider the Internet as the reference delivery infrastructure.

With this in mind, a simple cloud model would include the following stakeholders:

  • The cloud service provider
  • The cloud connectivity provider
  • The Internet
  • The user connectivity provider
  • The user

The cloud service provider is based in a data center (which, for simplicity, we assume he controls), where he has a number of servers running the cloud service being provided (e.g., a CRM system, a remote mail system, a remote file repository). He is responsible for ensuring that the servers are up and running and available at all times, and that there are enough of them to serve all the users who have subscribed to the service.

The cloud connectivity provider delivers Internet access connections to the cloud service provider and ensures that the cloud service provider has enough bandwidth for all of the users who wish to access the cloud service simultaneously. He must also ensure that these connections and the bandwidth requested are always available.

The user accesses the service remotely, typically through a web browser over the Internet. He also needs Internet access, which is provided by a connectivity provider (e.g., an ISP), but only enough to ensure that he can access the service quickly and without too many delays. The connectivity provider ensures that his connection and required bandwidth are always available.

Which leaves us with the Internet. Who is responsible for this? The connectivity providers will typically have control over their parts of the network, but they must rely on other connectivity service providers to bridge the gap between them. The beauty of the Internet is that they do not have to know about all the actors in the chain of delivery. As long as they have a gateway to the Internet and the destination IP address, then the packets can be directed to the right user and vice versa.

The Internet itself is made up of a number of interconnected networks, often telecom service provider networks, that have implemented IP networking and can provide connectivity across the geographical regions where they are licensed to provide services.

This brings the Internet and the cloud within the grasp of virtually everyone.

Cloud considerations
For cloud services to work, there are four fundamental requirements that need to be met:

  • There must be an open, standard access mechanism to the remote service that can allow access from anywhere to anyone who is interested in the service
  • This access must have enough bandwidth to ensure quality of experience (i.e. it should feel like the service or application is running on your desktop)
  • This access must be secure so that sensitive data is protected
  • This access must be available at ALL times

Some of these fundamentals are already in place and are driving adoption of cloud services. The Internet and IP networking have grown to the point where they provide the perfect access mechanism: a global network, accessible from virtually anywhere, as Internet connectivity is now close to ubiquitous. The bandwidth of the Internet is also not an issue; it is only a question of how much you are willing to pay for your connectivity.

Nevertheless, for users in particular, a modestly priced Internet connection provides all the bandwidth they need to access the cloud services they require.

So far so good!

Cloud service providers are extremely conscious of the fact that availability and security are key requirements and generally ensure that there are redundant servers, failover mechanisms and other solutions to ensure high availability. They also provide trusted security mechanisms to ensure that only the right people get access to sensitive data.

Still on track then!

That leaves the connectivity providers and the Internet itself. This is where more effort is needed.

Cloud compromised
IP networks and the Internet were designed for efficient transfer of data. Instead of establishing permanent connections, as with telephone calls, where data follows a pre-determined route every time, data is routed on a packet-by-packet basis over the best route available at the time, as determined by the network itself. There are a number of routes to the same destination, so even if one fails, others will work. What you cannot guarantee is when data packets will reach the destination, or in what order they will arrive. If packets don't arrive as expected, they are simply resent.
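To make this concrete, here is a minimal sketch (in Python, purely illustrative; the loss rate, sequence numbering and resend loop are simplifying assumptions, not a real transport protocol) of best-effort delivery: packets may be dropped or arrive out of order, lost ones are resent, and the receiver reassembles the message by sequence number.

```python
import random

def deliver(packets, loss_rate=0.3, seed=42):
    """Simulate best-effort delivery: packets may arrive out of order
    or be dropped; dropped packets are resent until all arrive."""
    rng = random.Random(seed)
    received = {}
    while len(received) < len(packets):
        # Resend every packet not yet acknowledged, in a shuffled order
        # (standing in for packets taking different routes)
        pending = [p for p in packets if p[0] not in received]
        rng.shuffle(pending)
        for seq, payload in pending:
            if rng.random() > loss_rate:      # packet survived the trip
                received[seq] = payload
    # Reassemble in sequence order, regardless of arrival order
    return "".join(received[seq] for seq in sorted(received))

message = "cloud"
packets = list(enumerate(message))   # (sequence number, one-byte payload)
print(deliver(packets))              # → "cloud"
```

Note that the message always arrives intact; what the sketch cannot promise is how many rounds of resending that takes, which is exactly the "when" that best-effort networks leave open.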

This works beautifully for data like web browsing, emails or file transfers, as it doesn't really matter when the data arrives as long as it gets there eventually.

But now, IP networks and the Internet are being used for all sorts of services: Voice-over-IP, Video-over-IP, storage networking and more. For many of these services, timing is critical and guaranteed bandwidth is required. Many of them also share the same connections as ordinary data services, so there must also be mechanisms to ensure that they are prioritized over data services like those mentioned earlier.

The issue with this for cloud computing is that there is no mechanism for ensuring that a cloud service transported over the Internet will be given priority. In fact, it won't!
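Inside a managed network, such prioritization typically looks like traffic classes served in strict priority order. The sketch below (illustrative; the class names and priority values are assumptions, loosely modeled on DiffServ-style classes) shows what that mechanism does, and by implication what cloud traffic on the public Internet goes without.

```python
import heapq

# Illustrative traffic classes: lower number = higher priority.
# The public Internet offers no such per-service guarantee, which is
# exactly the gap described above for cloud services.
PRIORITY = {"voip": 0, "video": 1, "cloud-app": 2, "bulk-data": 3}

def schedule(arrivals):
    """Drain queued packets in priority order rather than arrival order."""
    queue = []
    for order, (service, payload) in enumerate(arrivals):
        # 'order' breaks ties so equal-priority packets stay FIFO
        heapq.heappush(queue, (PRIORITY[service], order, payload))
    while queue:
        _, _, payload = heapq.heappop(queue)
        yield payload

arrivals = [("bulk-data", "backup-chunk"),
            ("voip", "voice-frame"),
            ("cloud-app", "crm-update")]
print(list(schedule(arrivals)))   # voice first, backup last
```

With no scheduler of this kind on the path, the CRM update simply queues behind the backup chunk.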

Cloud computing is in its nascent stages, but as its popularity grows, more and more people will access applications and services remotely, leading to increased Internet traffic and congestion as multitudes of users converge on a few critical cloud service provider points.

Up to now, this has not been an issue since there has been a healthy investment in networking capacity and congestion has been solved by "throwing bandwidth at the problem". But in these fiscally challenging times, this is no longer an option. Making more efficient use of the existing infrastructure is the order of the day.

Cloud certainty
To have confidence that cloud services are available at all times, it is not enough to wait for issues to occur and rely on fallback solutions. The utilization and performance of critical links must be monitored proactively to assure cloud service availability.

This requires dedicated network performance monitoring appliances at all points in the delivery chain. These appliances are stand-alone hardware and software systems that are capable of capturing and analyzing all the data traffic on a link in real-time, at speeds up to 10 Gbps. Each data packet is analyzed to understand where it has come from, where it is going and the application that produced it.

With this information in hand, it is possible to see the utilization of critical links, as well as the applications and users that are hogging the bandwidth.
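The aggregation step is straightforward once each packet has been classified. A minimal sketch (illustrative; the record format, addresses and capacity figure are assumptions, not the output of any particular appliance) of turning captured packet records into link utilization and a per-application "top talkers" list:

```python
from collections import defaultdict

def link_report(packets, link_capacity_bps, interval_s=1.0):
    """Aggregate captured packet records into per-application byte counts
    and overall link utilization for one measurement interval."""
    bytes_by_app = defaultdict(int)
    for src, dst, app, size_bytes in packets:
        bytes_by_app[app] += size_bytes
    total_bits = 8 * sum(bytes_by_app.values())
    utilization = total_bits / (link_capacity_bps * interval_s)
    # Sort applications by bytes consumed, biggest first
    top_talkers = sorted(bytes_by_app.items(), key=lambda kv: -kv[1])
    return utilization, top_talkers

capture = [("10.0.0.5", "203.0.113.9", "crm", 40_000),
           ("10.0.0.7", "203.0.113.9", "backup", 900_000),
           ("10.0.0.5", "203.0.113.9", "mail", 60_000)]
util, talkers = link_report(capture, link_capacity_bps=10_000_000)
print(f"{util:.1%}", talkers[0][0])   # → 80.0% backup
```

In this toy capture the backup traffic, not the cloud application, is what has pushed the 10 Mbps link to 80% utilization; that is the kind of insight these appliances exist to surface.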

For cloud service providers, these network appliances can be used to monitor their communication with the outside world; they also put providers in a position to demand visibility into their connectivity providers' networks, to understand how their traffic is being transported.

The connectivity provider can use these solutions to support the Service Level Agreements (SLAs) they have with cloud service providers and to ensure that the agreed bandwidth is available. They can equally demand the same level of SLA from the other connectivity providers in the Internet domain. Thus the chain continues.
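In practice, supporting an SLA means continuously comparing measured link performance against agreed thresholds. A hedged sketch (the thresholds and sample format are invented for illustration; real SLAs define their own metrics and windows):

```python
def check_sla(measurements, max_utilization=0.8, max_latency_ms=50):
    """Flag measurement intervals where link performance breaches the
    SLA agreed with the cloud service provider."""
    breaches = []
    for timestamp, utilization, latency_ms in measurements:
        if utilization > max_utilization or latency_ms > max_latency_ms:
            breaches.append((timestamp, utilization, latency_ms))
    return breaches

samples = [("12:00", 0.55, 18),
           ("12:01", 0.92, 61),   # congested interval
           ("12:02", 0.60, 22)]
print(check_sla(samples))   # only the 12:01 interval breaches
```

Feeding such checks with data from the monitoring appliances described above is what turns an SLA from a contractual promise into something that can actually be verified link by link.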

Such network performance tools are available today and are being deployed in many enterprise, data center and communication networks. However, they need to be regarded as an essential part of the cloud service delivery infrastructure.

Cloud confidence
Availability is important for cloud services, but so is security. Cloud service providers offer a number of mechanisms to ensure that only the right people gain access to critical data.

However, this is not the only threat. Malware, denial-of-service attacks and other malicious activity are becoming more prevalent. Countering them requires dedicated network security solutions, such as firewalls and intrusion prevention systems, that can provide a fence around critical access points. These sit primarily at the enterprise and the data center, where the users and cloud service providers reside, but can also be placed in the connectivity provider's network, securing critical links.

Again, these network security solutions are stand-alone hardware and software systems that are capable of analyzing high-speed data in real-time, taking action and then sending clean data traffic on its way. The process is completely transparent to the user and cloud service provider.
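At its simplest, such transparent inline filtering means matching each packet against block rules and passing everything else through untouched. A toy sketch (the rule sets and addresses are illustrative examples from documentation ranges, not real threat intelligence):

```python
# Toy inline filter: drop traffic matching simple block rules and pass
# the rest unchanged, as an IPS sitting transparently on a link would.
BLOCKED_PORTS = {23, 445}            # e.g. telnet, SMB exposed to the WAN
BLOCKED_SOURCES = {"198.51.100.66"}  # an assumed known-bad address

def filter_traffic(packets):
    clean = []
    for src, dst_port, payload in packets:
        if src in BLOCKED_SOURCES or dst_port in BLOCKED_PORTS:
            continue                 # silently drop suspicious traffic
        clean.append((src, dst_port, payload))
    return clean

traffic = [("203.0.113.4", 443, "tls-handshake"),
           ("198.51.100.66", 443, "scan"),
           ("203.0.113.4", 23, "telnet-probe")]
print(filter_traffic(traffic))   # only the first packet survives
```

Real intrusion prevention systems inspect far deeper than source and port, but the transparency property is the same: legitimate traffic emerges on the other side exactly as it went in.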

Using these systems ensures that the doors are firmly closed to would-be intruders and should be mandatory at all critical access points in the cloud service delivery chain.

Cloud clarity
Many pieces of the cloud service delivery chain are in place. What remains are the key components to assure service performance, availability and network security.

Network appliance solutions exist to address these areas, and they now have the performance to keep up with even the highest-speed networks, thanks to advanced network adapters capable of handling data traffic at up to 10 Gbps in real time without losing packets. What remains is to make these network appliances a mandatory component of the cloud service delivery infrastructure, underpinning clear SLAs that can assure performance and security across the delivery chain.

So don't let the cloud rain on your parade! Ensure that all the pieces are in place and enjoy the benefits that the cloud can provide and the new opportunities it will enable.

More Stories By Daniel Joseph Barry

Daniel Joseph Barry is VP Positioning and Chief Evangelist at Napatech and has over 20 years experience in the IT and Telecom industry. Prior to joining Napatech in 2009, he was Marketing Director at TPACK, a leading supplier of transport chip solutions to the Telecom sector.

From 2001 to 2005, he was Director of Sales and Business Development at optical component vendor NKT Integration (now Ignis Photonyx) following various positions in product development, business development and product management at Ericsson. He joined Ericsson in 1995 from a position in the R&D department of Jutland Telecom (now TDC). He has an MBA and a BSc degree in Electronic Engineering from Trinity College Dublin.
