|By Rod Fontecilla||
|April 13, 2009 10:45 AM EDT||
Cloud computing refers to the practice of leveraging third-party computing resources, such as network grids and server farms, to extend IT capabilities and reduce the cost of ownership. This practice offers numerous potential benefits to organizations that want to centralize software and data storage management while eliminating the costly overhead of in-house hardware and software maintenance and the personnel required to build, support, and maintain enterprise computing solutions.
Cloud computing has emerged as a new computing paradigm that gathers massive numbers of computers in centralized data centers to deliver Web-based applications, application platforms, and services via a utility model. The primary difference between cloud computing and previous service delivery models (e.g., outsourcing or data center consolidation) is scale. The premise is that as the scale of the cloud infrastructure increases, the incremental time and cost of application delivery trend toward zero.
Cloud computing allows users to dynamically and remotely control processing, memory, data storage, network bandwidth, and specialized business services from pools of resources, providing the ability to specify and deploy computing capacity on-demand. If there's a need to scale up to accommodate sudden demand, users can add the necessary resources using a Web browser. The large data center can provide similar services to multiple external customers (multi-tenancy), leveraging its shared resources to increase economies of scale and reduce service costs.
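The on-demand scaling described above reduces to a simple capacity calculation at provisioning time. The sketch below illustrates the idea; the per-instance capacity and utilization target are hypothetical numbers, not any provider's API or sizing guidance:

```python
import math

def instances_needed(requests_per_sec, capacity_per_instance=100, target_utilization=0.7):
    """Compute how many instances to provision so each runs at or below
    the target utilization (all sizing numbers here are illustrative)."""
    effective_capacity = capacity_per_instance * target_utilization
    return max(1, math.ceil(requests_per_sec / effective_capacity))

# A sudden demand spike triples the load; the same formula yields the
# larger fleet the user would request through the provider's console:
baseline = instances_needed(300)   # 5 instances at ~70 req/s each
spike = instances_needed(900)      # 13 instances
```

In a real cloud deployment, this calculation would feed an auto-scaling policy rather than a manual request, but the economics are the same: capacity is added only when demand requires it.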
Although cloud computing is in its early stages and definitions vary greatly, the underlying technologies today are consistent. These technologies include the following:
- Grid computing: A form of distributed parallel computing whereby processes are split up to leverage the available computing power of multiple CPUs acting in concert.
- Utility computing: A model of purchasing computing capacity, such as CPU, storage, and bandwidth, from an IT service provider, billed based on consumption.
- Virtualization technologies: Virtual servers and virtual private networks provide the ability to quickly reconfigure available resources on-demand and provide the necessary security assurance.
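The grid-computing idea in the list above — splitting one job across multiple CPUs acting in concert — can be sketched with Python's standard multiprocessing pool. This is an illustrative toy, not a real grid scheduler; in an actual grid, each chunk would run on a separate machine:

```python
from multiprocessing import Pool

def partial_sum(chunk):
    """Each worker computes its share of the job independently."""
    return sum(x * x for x in chunk)

def grid_sum_of_squares(data, workers=4):
    """Split the input into chunks, farm them out to a pool of
    processes, then combine the partial results."""
    chunks = [data[i::workers] for i in range(workers)]
    with Pool(workers) as pool:
        return sum(pool.map(partial_sum, chunks))
```

The split/compute/combine pattern is what lets a cloud turn added hardware into added throughput; applications not structured this way gain far less from the extra CPUs.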
There are a number of service offerings and implementation models under the cloud computing umbrella, each with associated pros and cons. These models can be grouped into the following three categories: Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), and Software-as-a-Service (SaaS). These models target varying levels of services, ranging from general infrastructure services, such as operating systems or database services provided by IaaS vendors, to targeted functional services provided by SaaS vendors (e.g., customer relationship management from Salesforce.com).
The various players in the current market can be differentiated into the following two categories:
- Cloud Providers: Offer one or more of the cloud models (i.e., IaaS, PaaS, or SaaS) as a service. Examples include Amazon and Google.
- Cloud Enablers: Provide technology or have adapted existing technology to run on or support cloud computing. A recent example is Oracle's partnership with Amazon to add Oracle 11g database support (technology and licensing) to Amazon's existing EC2 services offering.
We recognize that the transition to a cloud computing paradigm presents a number of challenges. Issues associated with information security, reliability, and service level agreements pose particular challenges for mission-critical systems. Furthermore, we've identified what we consider the key characteristics of a cloud computing environment:
- Minimized capital expenditure - infrastructure is provider-owned
- Device and location independence
- Multi-tenancy - enables resource and cost sharing among a large pool of users
- Monitored and consistent performance - can be affected by high network load
- Reliability via redundant sites - allows for business continuity and disaster recovery
- Scalability to ever-changing user demands - results in lower costs
- Improved security from centralized data and increased security-focused resources
My experience has emphasized the importance of "architecting for the cloud," rather than simply deploying system components to the cloud, to ensure that business requirements are met. Software and systems that are not designed to take advantage of the scalability and parallelism of the cloud will likely not achieve the full benefit of a cloud computing environment. That experience has also highlighted the need for IT managers to transition into brokers and negotiators of IT services rather than day-to-day managers of the operating platform.
My analysis of the benefits and challenges presented by the cloud computing paradigm has resulted in the identification of the following three cloud variations:
- Commercial Cloud: Deployment to one or more of the commercial cloud providers (e.g., Amazon or Google). It could be a simple integration with an existing SaaS service to support a subset of application functionality, or a complete migration to the cloud. This option may be appropriate for non-mission-critical systems (e.g., < 99.99% availability) that do not process sensitive data or where sensitive data won't traverse system boundaries to the cloud.
- On-Premises (Private) Cloud: An on-premises cloud could be created to provide some of the benefits of cloud computing. Booz Allen selected a similar option in our implementation for the FBI to address the security concerns associated with a classified environment; however, the multi-tenancy aspect is then limited to a single agency. Consequently, this option doesn't provide the massive scalability that's characteristic of a true cloud.
- Government Cloud: The creation of one or more government cloud computing environments. These environments would be designed specifically to address the concerns that are unique to the government. For civilian agencies, this cloud could be an extension of the current eGovernment lines of business (LoBs).
Though many cloud providers proclaim that moving existing applications to the cloud is seamless and doesn't require code changes, my experience has shown that greater analysis and re-engineering are required to achieve the full benefits of a cloud computing environment. Complexities remain that organizations must consider when moving to the cloud, and careful planning is essential.
Based on the lessons learned from previous efforts, I developed a phased Cloud Computing Transition Methodology designed to address the issues and risks associated with migrating an existing system to the cloud. Figure 1 provides an overview of this approach.
The Cloud Strategy and Planning phase (Phase 1) consists of three steps designed to ensure that all aspects of moving to a cloud environment have been appropriately evaluated and agreed upon. The three steps are:
1. Conduct a Strategic Diagnostic
The objective of the strategic diagnostic is to identify the major factors influencing the decision to move to the cloud environment and determine the best approach. During the diagnostic step, we will validate the key objectives of moving to the cloud and the "pain points" that the organization wants to address. Typical drivers include reducing risk, reducing the level of involvement in day-to-day IT management, eliminating overhead, improving productivity, reducing or eliminating the cost of adding users, and protecting the information system from misuse and unauthorized disclosure.
The primary areas to be addressed during the diagnostic step are security and privacy, technical, business and customer impact, economics, and governance and policy. We will evaluate the implications of moving to the cloud environment in each of these categories and document the key issues and considerations revealed during the diagnostic step. The outcome of this diagnostic will be sufficient analysis to support a "go/no go" decision to move to a cloud computing environment and the development of an agreed-on cloud strategy.
2. Define a Cloud Strategy
To define a cloud strategy, the organization should document a complete understanding of each component of its existing architecture. The analysis examines the required user services, processing services, information security, application software standards, and integrated software down to each component. This can be achieved by leveraging existing architectural documents and ensuring the appropriate level of detail is documented, including system-to-system interfaces, data storage, forms processing and reporting, distributed architecture, access control (authentication and authorization), and security and user provisioning.
3. Create an Implementation Plan
The implementation plan identifies the roles and responsibilities, operating model, major milestones, Work Breakdown Structure (WBS), risk plan, dependencies, and quality control mechanisms to implement the cloud strategy successfully.
After completing Phase 1, the organization will have fully analyzed its options, identified all requirements, thoroughly assessed short-term and long-term costs and benefits, gained executive governance approval, and socialized the solution with stakeholders (including oversight entities). This phase ensures that the organization can move to the cloud environment with a high degree of confidence, reap the expected benefits, avoid constraining future functionality, and avoid hidden future costs.
The Cloud Deployment phase (Phase 2) focuses on implementing the strategy developed in the planning phase. Leveraging the various cloud models helps identify the most effective solution(s) based on the existing organization architecture. Some of the criteria used in recommending a vendor are the vendor's primary service model (i.e., infrastructure, platform, or software), business model, how much existing technology it can leverage, end-user experience, and the risks involved in porting to the cloud. Deploying to the cloud involves taking the decision analysis from Phase 1 as input and proceeding with the following four steps:
1. Assess/Select the Cloud Provider(s)
The assessment step deals with analyzing the components of the architecture and identifying the optimal vendor offerings. One of the main criteria in selecting a provider is its ability to leverage existing technologies. For example, current Oracle customers can use their software licenses on Amazon's EC2 cloud, which allows for reusing existing technologies, so current databases can simply be moved to the cloud. In addition, Oracle lets customers deploy on the Amazon cloud through Amazon Machine Images (AMI). This way, a new virtual machine is ready for use with the Oracle database loaded in a matter of minutes.
The assessment step captures three important inputs: the current organization technical architecture, objectives, and the vendor selection criteria that are tailored to meet the organizational objectives. The vendor assessment results in recommendations on the most appropriate cloud vendor, assists in selecting the most effective cloud model, develops deployment strategies, highlights reusable components, and identifies the security options for the cloud architecture.
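One common way to turn the tailored selection criteria into a vendor recommendation is a weighted scoring matrix. The criteria names, weights, and vendor scores below are purely illustrative, not results from any actual assessment:

```python
def score_vendors(weights, vendor_scores):
    """Weighted sum of criterion scores (0-10) per vendor;
    the highest total is the recommended provider."""
    totals = {
        vendor: sum(weights[c] * s for c, s in scores.items())
        for vendor, scores in vendor_scores.items()
    }
    best = max(totals, key=totals.get)
    return best, totals

# Hypothetical criteria weights (summing to 1.0), reflecting the
# emphasis on reuse of existing technology discussed above:
weights = {"reuse_of_existing_tech": 0.4, "security": 0.3,
           "cost": 0.2, "support": 0.1}
scores = {
    "Vendor A": {"reuse_of_existing_tech": 9, "security": 6, "cost": 7, "support": 8},
    "Vendor B": {"reuse_of_existing_tech": 5, "security": 9, "cost": 8, "support": 7},
}
best, totals = score_vendors(weights, scores)
```

The value of the matrix is less the arithmetic than the discipline: it forces the organization to state its criteria and weights before looking at vendors.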
2. Establish Service Level Agreements (SLAs)
Unlike traditional computing models, where most, if not all, of the components are on-premises and there's direct control over services, cloud computing involves handing off the system to a third-party vendor or vendors. In this case, SLAs address concerns like performance, downtime, provisioning, security, backup, and recovery, and ensure that objectives and established benchmarks are being met. SLAs formalize the contractual agreement between the organization and the selected vendor(s) and will highlight the offerings of the vendor(s), so the expectations on both sides are clear.
A typical SLA will identify service levels for the following:
- Retention Time: How long the organization can sustain its operations during an emergency or outage
- Uptime: The percent of the time that the system will be available (e.g., 99.9%) and the period over which the measurement is taken
- Performance and throughput
- Security and data protection: Where is the data stored? What precautions does the vendor take to ensure the data isn't tampered with?
- The level of support offered (e.g., 24/7)
- Service credits if the SLA isn't met
- The SLA can also address specific concerns, such as guarantees of data protection and privacy when foreign entities are hosted on the same infrastructure
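An uptime percentage in an SLA translates directly into a permitted-downtime budget, which is worth computing before signing. A quick sketch of the arithmetic:

```python
def allowed_downtime_minutes(uptime_percent, period_days=30):
    """Convert an SLA uptime guarantee into the maximum downtime
    (in minutes) the vendor may incur over the measurement period."""
    total_minutes = period_days * 24 * 60
    return total_minutes * (1 - uptime_percent / 100)

# "Three nines" over a 30-day month allows about 43 minutes of downtime:
three_nines = allowed_downtime_minutes(99.9)    # ~43.2 minutes
# "Four nines" tightens that to about 4.3 minutes:
four_nines = allowed_downtime_minutes(99.99)    # ~4.3 minutes
```

This is also why the measurement period in the SLA matters: 99.9% measured per year permits a single eight-hour outage, while 99.9% measured per month does not.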
3. Execute Transition
The execution step involves the actual transition of components identified in earlier steps. Based on the number and type of components that are being ported to the cloud, execution can be an iterative process. One of the primary steps in execution is to establish multiple environments, such as development, testing, production, and training. The preliminary questionnaire to set up an environment can include items like the number of instances required, memory, storage space, and basic software that needs to be installed.
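The preliminary questionnaire described above maps naturally onto a simple provisioning spec per environment. The field names and values here are hypothetical illustrations, not any provider's schema:

```python
from dataclasses import dataclass, field

@dataclass
class EnvironmentSpec:
    """Answers to the provisioning questionnaire for one environment."""
    name: str
    instances: int
    memory_gb: int
    storage_gb: int
    base_software: list = field(default_factory=list)

# One spec per environment, filled out during transition planning:
environments = [
    EnvironmentSpec("development", instances=2, memory_gb=4, storage_gb=100,
                    base_software=["os", "database", "app-server"]),
    EnvironmentSpec("testing", instances=2, memory_gb=8, storage_gb=200),
    EnvironmentSpec("production", instances=8, memory_gb=16, storage_gb=1000),
    EnvironmentSpec("training", instances=1, memory_gb=4, storage_gb=50),
]
total_instances = sum(e.instances for e in environments)
```

Capturing the questionnaire as structured data, rather than a document, makes each iteration of the transition repeatable and auditable.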
4. O&M and Help Desk
The level of O&M and help desk support provided by the cloud vendor may be driven by the selected cloud model(s) and will be determined by the SLAs previously established. If an IaaS model is chosen, where the vendor provides only the hardware resources and the organization installs the software components and deploys the applications, the maintenance provided by the vendor will be limited. Different vendors provide different support models at different levels. It's important to identify the essential support functions vendors will provide for successful continuity of operations (COOP) and to understand how quickly operations would be restored at the backup site.
Deploying cloud computing solutions requires both a short-term and a long-term strategy. For example, besides the improved scalability and reliability provided by the cloud, which organizations may achieve through the initial transition, re-engineering some components to take advantage of the parallelism provided by the cloud could improve system performance and overall scalability further. Transitioning an existing system to the cloud requires an approach that addresses not only the technical aspects of cloud computing but also considers the objectives of the organization, the constraints imposed by the existing system, and the impact to its existing customers. As an experienced cloud computing strategist, I believe that corporations and even the government sector are prepared to consider the complexities of cloud computing technologies.