By Dan Joe Barry
December 10, 2009 04:45 AM EST
We are standing on the threshold of a new transition in information technology and communications: a radical departure from current practice that promises to bring us new levels of efficiency at a vastly reduced cost. Cloud computing is full of potential, bursting with opportunity and within our grasp.
But remember that clouds always appear to be within our grasp, and a bursting cloud promises only one thing: rain!
As with all radical transitions, it takes time for the various pieces to fall into place. Some of them are already in place; some of them have yet to be considered. In this article, we will take a look at both and try to gauge where we are today and what work still remains. In addition, we will try to understand what this means to the various stakeholders involved.
So what is the cloud, and who are the stakeholders involved? There are many definitions available, but in simple terms, cloud computing involves providing an information technology service that is accessed remotely. This access can be over a public or private infrastructure, but for our purposes, it is useful to take the Internet as the reference delivery infrastructure.
With this in mind, a simple cloud model would include the following stakeholders:
- The cloud service provider
- The cloud connectivity provider
- The Internet
- The user connectivity provider
- The user
The cloud service provider is based in a data center (which, for simplicity, we assume he controls), where he has a number of servers running the cloud service being provided (e.g., a CRM system, a remote mail system, a remote file repository). He is responsible for ensuring that the servers are up and running and available at all times, and that there are enough of them to serve all the users who have subscribed to the service.
The cloud connectivity provider delivers Internet access connections to the cloud service provider and ensures that the cloud service provider has enough bandwidth for all of the users who wish to access the cloud service simultaneously. He must also ensure that these connections and the bandwidth requested are always available.
The user accesses the service remotely, typically through a web browser over the Internet. He also needs Internet access, which is provided by a connectivity provider (e.g., an ISP), but only enough to ensure that he can access the service quickly and without undue delay. The connectivity provider ensures that his connection and required bandwidth are always available.
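To make this concrete, here is a minimal sketch in Python of what "accessing a service remotely" boils down to: a request to a destination address and a response back over whatever networks lie in between. The service URL is hypothetical.

```python
# Minimal sketch of remote access to a cloud service over HTTP.
# The URL is hypothetical; any reachable HTTPS endpoint would do.
import urllib.request

def fetch_service_status(url="https://crm.example-cloud.com/status"):
    # One round trip over the public Internet: the user needs only
    # a destination address and enough bandwidth for the response.
    with urllib.request.urlopen(url, timeout=5) as response:
        return response.status, response.read()

status, body = fetch_service_status()
print(f"HTTP {status}, {len(body)} bytes received")
```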
That leaves us with the Internet. Who is responsible for this? The connectivity providers will typically have control over their own parts of the network, but they must rely on other connectivity providers to bridge the gap between them. The beauty of the Internet is that they do not have to know about all the actors in the delivery chain. As long as they have a gateway to the Internet and the destination IP address, packets can be directed to the right user and vice versa.
The Internet itself is made up of a number of interconnected networks, often operated by telecom service providers that have implemented IP networks and can provide connectivity across the geographical regions where they are licensed to provide services.
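How little each network needs to know can be shown with a minimal sketch of the longest-prefix-match lookup that every router along the way performs. The prefixes and next-hop names below are invented for illustration.

```python
# Sketch of forwarding by destination IP alone: the most specific
# (longest) matching prefix decides the next hop. Entries invented.
import ipaddress

ROUTING_TABLE = [
    (ipaddress.ip_network("0.0.0.0/0"), "gateway-to-internet"),
    (ipaddress.ip_network("203.0.113.0/24"), "peer-provider-a"),
    (ipaddress.ip_network("203.0.113.128/25"), "cloud-data-center"),
]

def next_hop(destination: str) -> str:
    addr = ipaddress.ip_address(destination)
    matches = [(net, hop) for net, hop in ROUTING_TABLE if addr in net]
    # Longest prefix wins: prefer the most specific route.
    return max(matches, key=lambda m: m[0].prefixlen)[1]

print(next_hop("203.0.113.200"))  # -> cloud-data-center
print(next_hop("198.51.100.7"))   # -> gateway-to-internet
```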
This brings the Internet and the cloud within the grasp of virtually everyone.
For cloud services to work, there are four fundamental requirements that need to be met:
- There must be an open, standard access mechanism to the remote service that can allow access from anywhere to anyone who is interested in the service
- This access must have enough bandwidth to ensure quality of experience (i.e. it should feel like the service or application is running on your desktop)
- This access must be secure so that sensitive data is protected
- This access must be available at ALL times
Some of these fundamentals are in place and are driving adoption of cloud services. The Internet and IP networking have grown to the point where they provide the perfect access mechanism: a global network, accessible from anywhere, as Internet connectivity is now virtually ubiquitous. The bandwidth of the Internet is also not an issue - it is only a question of how much you are willing to pay for your connectivity.
Even so, for users in particular, a modestly priced Internet connection provides all the bandwidth they need to access the cloud services they require.
So far so good!
Cloud service providers are extremely conscious of the fact that availability and security are key requirements and generally ensure that there are redundant servers, failover mechanisms and other solutions to ensure high availability. They also provide trusted security mechanisms to ensure that only the right people get access to sensitive data.
Still on track then!
That leaves the connectivity providers and the Internet itself. This is where more effort is needed.
IP networks and the Internet were designed for efficient transfer of data. Instead of establishing permanent connections, as in a telephone call, where data follows a pre-determined route every time, data is routed packet by packet over the best route available at the time, as determined by the network itself. There are a number of routes to the same destination, so even if one fails, others will work. What you cannot guarantee is when data packets will reach the destination or in what order they will arrive. If packets do not arrive as expected, they are simply resent.
This works beautifully for data like web browsing, emails or file transfers, as it doesn't really matter when the data arrives as long as it gets there eventually.
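As a minimal sketch, loosely modeled on what TCP does (the packet format here is invented), the receiver simply reassembles by sequence number and asks again for anything missing:

```python
# Sketch of reassembly at the receiver: order of arrival does not
# matter, and gaps are detected and re-requested.

def reassemble(packets):
    """packets: list of (sequence_number, payload) in arrival order."""
    buffer = dict(packets)
    seen = sorted(buffer)
    missing = [n for n in range(seen[0], seen[-1] + 1) if n not in buffer]
    if missing:
        return None, missing              # caller asks for a resend
    return b"".join(buffer[n] for n in seen), []

# Packets arrive out of order and number 2 was lost in transit:
data, missing = reassemble([(3, b"llo"), (1, b"He")])
print(missing)  # [2] -> resend it, then reassembly yields b"Hello"
```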
But now, IP networks and the Internet are being used for all sorts of services, such as Voice-over-IP, Video-over-IP and storage networking. For many of these services, time is critical and guaranteed bandwidth is required. Many of them also share the same connections as normal data services, so there must also be mechanisms to prioritize them over data services like those mentioned earlier.
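One such mechanism is DiffServ marking. As a sketch, assuming a platform that exposes the IP_TOS socket option, a time-critical flow can mark its packets for Expedited Forwarding; whether the networks along the path honor the mark is another matter, as discussed next.

```python
# Sketch of DiffServ marking for a time-critical (e.g., voice) flow.
# Assumes the platform exposes IP_TOS; the address is illustrative.
import socket

EF_DSCP = 46              # Expedited Forwarding code point
TOS = EF_DSCP << 2        # DSCP occupies the upper six bits of TOS

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_TOS, TOS)
# Packets sent on this socket now carry the EF marking; routers that
# implement DiffServ can queue them ahead of bulk data traffic.
sock.sendto(b"voice frame", ("192.0.2.10", 5004))
```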
The issue with this for cloud computing is that there is no mechanism for ensuring that a cloud service transported over the Internet will be given priority. In fact, it won't!
Cloud computing is in its nascent stages, but as the popularity of this approach grows, more and more people will access applications and services remotely, increasing Internet traffic and creating congestion points as multitudes of users converge on a few critical cloud service provider sites.
Up to now, this has not been an issue since there has been a healthy investment in networking capacity and congestion has been solved by "throwing bandwidth at the problem". But in these fiscally challenging times, this is no longer an option. Making more efficient use of the existing infrastructure is the order of the day.
To have confidence that cloud services are available at all times, it is not enough to wait for issues to occur and rely on fallback solutions. The utilization and performance of critical links must be monitored proactively to assure cloud service availability.
This requires dedicated network performance monitoring appliances at all points in the delivery chain. These appliances are stand-alone hardware and software systems that are capable of capturing and analyzing all the data traffic on a link in real-time, at speeds up to 10 Gbps. Each data packet is analyzed to understand where it has come from, where it is going and the application that produced it.
With this information in hand, it is possible to see the utilization of critical links, as well as the applications and users that are hogging the bandwidth.
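As an illustrative sketch, not the internals of any particular appliance, the per-packet accounting could look like this: decode the IPv4 and TCP headers, classify by destination port, and keep byte counts per source and application (TCP over IPv4 is assumed for simplicity).

```python
# Sketch of per-packet accounting: where a packet came from, where
# it is going, and which application produced it. Assumes TCP/IPv4.
import struct
from collections import Counter

PORT_TO_APP = {80: "http", 443: "https", 25: "smtp"}  # illustrative
usage = Counter()

def account(ip_packet: bytes):
    ihl = (ip_packet[0] & 0x0F) * 4                     # IP header length
    total_len = struct.unpack("!H", ip_packet[2:4])[0]  # bytes on the wire
    src = ".".join(map(str, ip_packet[12:16]))          # source address
    dst_port = struct.unpack("!H", ip_packet[ihl + 2:ihl + 4])[0]
    app = PORT_TO_APP.get(dst_port, f"port-{dst_port}")
    usage[(src, app)] += total_len

# After feeding every captured packet through account(), the top
# bandwidth consumers fall out directly, e.g.:
#   usage.most_common(5) -> [(('10.0.0.7', 'https'), 1830400), ...]
```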
For cloud service providers, these network appliances can be used to monitor communication with the outside world; they also allow providers to demand visibility into their connectivity providers' networks to understand how their traffic is being transported.
Connectivity providers can use the same solutions to support the Service Level Agreements (SLAs) they have with cloud service providers and to ensure that the agreed bandwidth is available. They can equally demand the same level of SLA from the other connectivity providers in the Internet domain. Thus the chain continues.
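A minimal sketch of such a proactive SLA check, with an invented link speed and threshold: compare measured utilization per interval against the headroom the SLA requires, and raise a flag before users notice anything.

```python
# Sketch of a proactive utilization check on a monitored link.
# Capacity and threshold are illustrative, not from any real SLA.

LINK_CAPACITY_BPS = 10_000_000_000   # a 10 Gbps link
UTILIZATION_WARN = 0.70              # warn at 70% sustained load

def check_interval(bytes_seen: int, seconds: float) -> str:
    utilization = (bytes_seen * 8) / (seconds * LINK_CAPACITY_BPS)
    if utilization >= UTILIZATION_WARN:
        return f"WARN: link at {utilization:.0%}, SLA headroom shrinking"
    return f"OK: link at {utilization:.0%}"

print(check_interval(bytes_seen=9_000_000_000, seconds=10))  # WARN: 72%
```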
Such network performance tools are available today and are being deployed in many enterprise, data center and communications networks. However, they need to be regarded as an essential part of the cloud service delivery infrastructure.
Availability is important for cloud services, but so is security. Cloud service providers offer a number of mechanisms to ensure that only the right people gain access to critical data.
However, unauthorized access is not the only threat. Malware, denial-of-service attacks and other malicious activity are becoming more prevalent. Countering them requires dedicated network security solutions, such as firewalls and intrusion prevention systems, that can put a fence around critical access points. These are deployed primarily at the enterprise and in the data center, where the users and cloud service providers reside, but they can also sit in the connectivity provider's network, securing critical links.
Again, these network security solutions are stand-alone hardware and software systems capable of analyzing high-speed traffic in real time, taking action and then sending clean traffic on its way. The process is completely transparent to both the user and the cloud service provider.
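As a toy sketch of that inline loop (a real IPS matches thousands of signatures at line rate; the rules and addresses here are invented), the appliance inspects each packet, drops anything matching a rule, and forwards clean traffic untouched:

```python
# Toy sketch of a transparent inline security appliance: inspect,
# act, then send clean traffic on its way. Rules are invented.

BLOCKED_SOURCES = {"198.51.100.66"}           # known-bad host (example)
SIGNATURES = [b"DROP TABLE", b"/etc/passwd"]  # illustrative payload rules

def transmit(packet):
    pass  # placeholder for the appliance's egress port

def inspect(src_ip: str, payload: bytes) -> bool:
    """Return True if the packet may be forwarded unchanged."""
    if src_ip in BLOCKED_SOURCES:
        return False
    return not any(sig in payload for sig in SIGNATURES)

def forward(packet):
    src_ip, payload = packet
    if inspect(src_ip, payload):
        transmit(packet)  # clean traffic continues, untouched
    # Dropped packets are simply never transmitted; neither endpoint
    # ever sees the appliance itself.

forward(("203.0.113.5", b"GET /index.html"))    # forwarded
forward(("198.51.100.66", b"anything at all"))  # silently dropped
```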
These systems keep the doors firmly closed to would-be intruders, and they should be mandatory at all critical access points in the cloud service delivery chain.
Many pieces of the cloud service delivery chain are in place. What remains are the key components to assure service performance, availability and network security.
Network appliance solutions exist to address these areas, and they now have the performance to keep up with even the highest-speed networks, thanks to advanced network adapters capable of handling data traffic at up to 10 Gbps in real time without losing packets. What remains is to make these network appliances a mandatory component of the cloud service delivery infrastructure, underpinning clear SLAs that can assure performance and security across the delivery chain.
So don't let the cloud rain on your parade! Ensure that all the pieces are in place and enjoy the benefits that the cloud can provide and the new opportunities it will enable.