|By Jon Shende||
|August 31, 2010 07:45 AM EDT||
[Adopted from my BLOG December 2009]
Lately in the IT community, all the hype is on Cloud Computing. We have small start-ups offering several variations of cloud services, as well as some of the established big players (Google, Amazon, IBM, Novell (aimed at cloud service providers), Sun) stepping up their cloud offerings.
But what exactly is Cloud Computing? Is it virtualization? Is it the services we have accessed via a web browser over the years, something totally new, or is it all of these, just rebranded?
The term Cloud Computing started gaining traction when Google and IBM launched a university initiative to address internet scale computing back in 2007.
These services have been evolving since the 90s; their previous incarnations can be said to be Grid and Utility computing and the Software as a Service offerings we saw around a decade ago.
In a nutshell we can draw an analogy which can be stated as this: think of a utility service you use, say for example electricity. You get your meter read every few weeks and you receive a bill for energy consumed between readings.
The same underlying premise can be applied to a cloud service: an end user can subscribe to any of the offered cloud services and, based on usage, be billed by the provider for consumption of that particular service or series of services over the specified time-frame.
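The metered-billing analogy above can be sketched in a few lines of code. This is a minimal, hypothetical model (the class, service names and rates are illustrative, not any vendor's actual pricing) showing the core idea: the subscriber pays only for units consumed between "meter readings."

```python
from dataclasses import dataclass

@dataclass
class MeteredService:
    """A hypothetical pay-as-you-go service with a per-unit rate."""
    name: str
    rate_per_unit: float  # e.g. dollars per GB or per VM-hour

    def bill(self, units_consumed: float) -> float:
        """Charge only for what was consumed between 'meter readings'."""
        return round(self.rate_per_unit * units_consumed, 2)

# Illustrative rates only -- real providers publish their own price lists.
storage = MeteredService("block-storage", rate_per_unit=0.10)
compute = MeteredService("vm-hours", rate_per_unit=0.085)

invoice = storage.bill(120) + compute.bill(300)  # 12.0 + 25.5
print(invoice)  # 37.5
```

Just as with an electricity bill, a lighter month of usage produces a smaller invoice with no change to the subscription itself.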
One can safely state that Cloud Computing, as an on-demand, self-service, pay-as-you-go utility, evolved from a combination of grid computing, virtualization, and automation.
Experts estimate that this industry will grow to 42 billion dollars by 2012; however, the implementation and usage of cloud computing models and services is not without issues.
Most business managers will likely consider the Capex and Opex aspects, especially in this economy. How much money an IT department can save while still maintaining operational efficiency and security is a primary focus; by implementing one or more cloud computing services, an enterprise can obtain the scale and flexibility it needs and potentially save time as well through dynamic provisioning of needed services.
One Cloud Computing claim is that it lowers costs, increases business agility and increases the velocity at which applications can be deployed; however, a good question to consider is whether one should expect its implementation to be disruptive, and to what extent.
In order to engage cloud computing services, business models will have to be adjusted or outright changed, in order to effectively and efficiently manage the utility aspect of computing power used in everyday operations and the manner in which management will be able to utilize resources.
As with any implementation, standards and regulations need to be formulated and implemented in order to ensure that both vendor and tenant are in compliance and within governance of an agreed set of policies.
As of now there are no formal standards directed solely toward cloud computing; however, NIST has proposed a potential framework called Standards Acceleration to Jumpstart Adoption of Cloud Computing (SAJACC). With this, every effort should be made to ensure the confidentiality, availability and integrity of data held within a cloud computing environment going forward.
The National Institute of Standards and Technology (NIST) defines cloud computing as "a pay-per-use model for enabling available, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, services) that can be rapidly provisioned and released with minimal management effort or service provider interaction."
Cloud Computing Models: Cloud models can be one of the following three:
- Infrastructure as a Service (IaaS), e.g. Tier 3, Amazon EC2: while the subscriber does not control the cloud infrastructure, they do have control over select portions of the network, e.g. firewalls, operating system, deployed applications and storage.
- Platform as a Service (PaaS): this goes back to the early '70s, when it was referred to as Framework as a Service. What it does is simply provide different combinations of services to a subscriber supporting an application development life-cycle, e.g. Google's App Engine, which lets a subscriber run web applications on Google's infrastructure, or Azure. In essence the subscriber will use programming languages (.NET, Java, Python) and tools supplied by the service provider, with no underlying responsibility for the cloud-deployed network, servers, operating system, storage, etc.
- Software as a Service (SaaS), e.g. Facebook, Salesforce.com: applications running on a cloud infrastructure that can be accessed via a web browser interface.
Bear in mind that there can be dependencies and a relationship between the models: Infrastructure as a Service can be stated as the foundation of Cloud Computing services, upon which Platform as a Service and then Software as a Service are built.
These services can be implemented by the end user in four different ways:
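The layered dependency between the three models can be sketched as a simple data structure. This is purely an illustrative model of the relationships described above; the capability names are informal, not taken from any standard.

```python
# Hypothetical sketch of the layered dependency between cloud service models.
# Each layer builds on the one beneath it; capability names are illustrative.
LAYERS = {
    "IaaS": {"provides": ["network", "servers", "storage"], "built_on": None},
    "PaaS": {"provides": ["runtime", "dev tools", "deployment"], "built_on": "IaaS"},
    "SaaS": {"provides": ["end-user applications"], "built_on": "PaaS"},
}

def stack(model: str) -> list:
    """Return the full stack a given model depends on, foundation first."""
    chain = []
    while model is not None:
        chain.append(model)
        model = LAYERS[model]["built_on"]
    return list(reversed(chain))

print(stack("SaaS"))  # ['IaaS', 'PaaS', 'SaaS']
```

Walking the chain from SaaS back to its foundation makes the point concrete: a SaaS offering ultimately rests on platform and infrastructure layers, whether the subscriber sees them or not.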
1) Private Cloud, aka a corporate cloud, refers to a proprietary computing architecture providing hosted services to a limited number of people, which resides behind a corporate firewall; in other words, a single tenant.
By using private clouds, enterprises/tenants will receive the same economies of scale and bi-directional scaling as the public cloud user.
However, being a single enterprise, or a division within that enterprise, brings the additional benefit of more control and security for data held within the private cloud. On-premise data centers can be converted into private clouds by implementing virtualization technologies from companies such as Microsoft, Citrix, VMware, Novell and Sun.
2) Public Cloud is cloud provided for lease external to an entity's physical location, e.g. Amazon's EC2.
This deployment facilitates rapid scaling via virtualization technologies (which enable cloud user resources to rapidly start up and shut down) and can be utilized by multiple tenants; however, within this deployment users have no access to dedicated resources.
This results in users giving up a certain amount of control over the process, which in turn can raise security and compliance issues.
3) Hybrid Cloud is a mixture of the public and private models. This can be appealing for a company that chooses to store non-confidential data externally, say using Simple Storage Service (S3), whilst keeping private data in-house.
4) Managed Cloud: in this offering the physical infrastructure in operation is owned by the subscriber and can be housed within the physical premises of the subscriber.
However, the service provider will control portions of the management and security of the service utility.
Some Deployment Concerns
As with the deployment of any IT system there will be challenges and cause for concern. Certain scenarios will have to be anticipated and use cases as well as processes to mitigate these concerns need to be clarified. Some examples of concerns are as follows:
1) Software licenses: software is typically subscribed to either under a proprietary license or under a free and open source license. Software licenses govern usage or redistribution of software, which is in most cases copyright protected.
Something to consider is how software that you want to deploy into a cloud is licensed.
Is the software you want to deploy licensed on a per-server basis or not, and how easy or difficult will deploying your software of choice into a cloud be?
Will proprietary software solutions need to be confined to dedicated hosting environments? Most likely yes, at least for now or until you can get a vendor who can securely provide the software you need on a pay as you go basis.
Because of how software licensing is structured, early cloud users have been found to use more open source software.
2) Single point of failure: if a mission-critical application is deployed via a single vendor, issues at the vendor's site may severely impact the availability of resources for the tenant.
The vendor may claim to have multiple, fully powered remote backup locations. However, when it comes to ensuring that, in addition, redundant cloud administration and infrastructure software are in place, the vendor may fall short.
3) Portability: a cause for concern is that each vendor may utilize different application APIs and data formats. This in turn may limit application and data portability to other environments, as vendors are likely to be using proprietary APIs, causing a "lock-in" situation: it is easy to sign a contract and use a vendor's service, but transferring out to another vendor could pose major issues.
Of course as the cloud computing environment evolves this may soon be remedied.
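One practical way a tenant can soften lock-in today is to keep vendor-specific calls behind a neutral interface of their own. The sketch below is a hypothetical adapter pattern (the `ObjectStore` interface and the in-memory stand-in are invented for illustration; a real adapter would wrap a particular vendor's SDK):

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """Vendor-neutral storage interface. Application code depends only on
    this, never on a provider's proprietary API directly."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in adapter; a real one would call a specific vendor's SDK."""
    def __init__(self):
        self._blobs = {}

    def put(self, key, data):
        self._blobs[key] = data

    def get(self, key):
        return self._blobs[key]

# Swapping vendors then means writing one new adapter class,
# not rewriting the application.
store = InMemoryStore()
store.put("report.txt", b"q3 figures")
print(store.get("report.txt"))  # b'q3 figures'
```

This does not remove the migration cost of moving the data itself, but it keeps the application code portable while APIs remain unstandardized.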
4) Security: the elephant in the cloud room. The most common fear with using a cloud deployment is a loss of control and security of data.
Granted, this is still a system built on hardware and software platforms and as such is still susceptible to traditional security attacks (DoS, DDoS, etc.); conversely, a point for consideration is that any security measure will be more cost effective when implemented on a larger scale.
Any good IT manager has voiced concerns over whether employees/administrators at the cloud provider can be trusted not to look at data or even modify it, or whether other customers sharing the cloud can hack data or access it without leaving an audit trail.
From this, a tenant can ask about the methods the vendor is employing to protect data, such as strong physical security, as well as what types of monitoring, intrusion detection and firewall equipment are in place at their centers.
Even worse is whether competitors could find out sensitive information such as customer orders, pricing and cost information, and negatively impact business. And of course what about privacy concerns and government regulations?
Other issues of concern can be:
- What levels of protection are in place to prevent one customer from accessing another customer's data or application within a shared cloud space?
- Who will be liable for security breaches, and how will the law in any one jurisdiction ensure compliance?
- How well will a vendor system integrate with a tenant's security systems?
5) Scalability: every user or potential user of the cloud constantly hears of the substantial savings they will realize by utilizing cloud-based resources.
In order to take full advantage of the scalability of the cloud, there should be some form of dynamic measurement and resource management for applications held within a cloud.
Scalability within the cloud can be had by composing the service from other scalable services as can be seen with Google App Engine.
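The "dynamic measurement and resource management" point above can be made concrete with a toy policy. This is a minimal sketch of threshold-based scaling; the thresholds and the single CPU metric are illustrative assumptions, not any provider's actual autoscaling logic:

```python
def scale_decision(cpu_load: float, instances: int,
                   high: float = 0.75, low: float = 0.25,
                   min_instances: int = 1) -> int:
    """Simple threshold policy: add capacity above `high` average load,
    shed it below `low`, never dropping under `min_instances`."""
    if cpu_load > high:
        return instances + 1
    if cpu_load < low and instances > min_instances:
        return instances - 1
    return instances

print(scale_decision(0.90, 4))  # 5 -> scale out under heavy load
print(scale_decision(0.10, 4))  # 3 -> scale in when idle
print(scale_decision(0.50, 4))  # 4 -> steady state, no change
```

Real schedulers weigh many metrics and smooth over time windows, but the pay-per-use benefit comes from exactly this loop: capacity tracks measured demand instead of a fixed peak provision.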
6) Auditing: with the cloud one has to consider how compliance with ISO standards, Sarbanes-Oxley, HIPAA, PCI-DSS, etc. will affect whether certain data can be deployed. This is more so when considering the attractiveness of data to unauthorized entities and the methods they could use to gain access to that data.
Any IT manager will also tell you that without proper planning the cost of an audit can be higher than expected.
As of this writing, I am not aware of any formulated standards for auditing within the cloud; however, I must state that for a business, auditing within the cloud may be an attractive option, as this can be done live with no downtime or interruption to business processes.
7) Compliance: there are no standards in place as of yet, but the National Institute of Standards and Technology (NIST) and others are working toward that end.
8) Other Data Access: what happens to my data if the vendor revokes my access or there is a system malfunction? This is a common question that anyone thinking of using the cloud should ask. Remember the data loss for T-Mobile customers using the services of Microsoft subsidiary Danger? "Microsoft said any data that users had on their devices and is no longer there has almost certainly been permanently lost." Here there was no revocation of access but an alleged system glitch.
A personal example occurred a few days ago when I tried to access a Gmail account I kept just for research and online backup.
The system message intimated that I had violated the "Terms of Agreement". What?! The Gmail account was hardly ever used to send email, and the Google Docs account was used as a second online backup for some of my documents and files. If this had not been a secondary backup, I would have lost access to all my uploaded documents and files, with no recourse for resolution but filling out a form and hoping for contact from the support center.
In order to address and mitigate these issues the tenant should ensure that workarounds and backup plans are worked into their Service Level Agreements (SLAs) with the vendor.
Service Level Agreements (SLAs)
A service level agreement is the part of a service contract where the level of service is formally defined. In practice, the term is sometimes used to refer to the contracted delivery time (of the service) or performance. Whilst there may not be much flexibility with a vendor in defining an SLA, I am confident that the laws of supply and demand will shift this more toward the tenant in the near future.
Cloud computing vendors are getting into this business to affect their bottom line and shareholder value if publicly traded. At the end of the day their focus will be on making a profit on services offered.
In light of this most tenants may feel as though they are getting into an arrangement where it appears as though vendors create the SLAs for their own protection against litigation, with minimal assurances to a tenant.
That being said, this does not mean that an IT manager cannot make the SLA work as a tool to choose an appropriate service provider. An IT manager's main concern will be the security of data, and of course, the traditional interpretation of the CIA triad (Confidentiality, Integrity and Availability) may not be applicable within their cloud service.
To start an IT manager can focus on the following when hammering out their SLA with a vendor:
1) Data Protection: where there is a clear definition as to who will have access to the data and the levels of protection in effect for their data.
Some questions that can be asked are:
- How will data be encrypted?
- How will compliance be addressed?
- What are the levels of access control?
- Will there be sub-contractors or third party providers processing the data?
- Where are backups stored?
- How is the data center secured?
- What happens to the data if service providers are switched?
- What processes are in place to mitigate legal inquiries about a customer's data?
- How often are audits done and what types of auditing tools are in place?
- What happens to my data if there is an investigation taking place on another tenant sharing services and how will you ensure my access to my data in the event of equipment seizure by federal entities?
- How is data deletion handled?
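One way a tenant can reduce dependence on the vendor's answers to some of the questions above is to verify data integrity independently. The sketch below shows a client-side HMAC check using only the Python standard library; the key name and data are illustrative, and a real deployment would combine this with encryption and proper key management:

```python
import hmac
import hashlib

def seal(data: bytes, key: bytes):
    """Attach an HMAC tag before uploading, so tampering inside the
    provider's environment is detectable on retrieval."""
    tag = hmac.new(key, data, hashlib.sha256).hexdigest()
    return data, tag

def verify(data: bytes, tag: str, key: bytes) -> bool:
    """Recompute the tag on retrieval and compare in constant time."""
    expected = hmac.new(key, data, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

# The key stays with the tenant and is never given to the vendor.
key = b"tenant-held secret"
blob, tag = seal(b"customer order history", key)

print(verify(blob, tag, key))              # True  -> data unmodified
print(verify(b"tampered copy", tag, key))  # False -> modification detected
```

This does not keep the data confidential (that requires encryption), but it gives the tenant an audit check on integrity that does not rely on trusting the provider's administrators.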
2) Continuity: one has to consider what happens in the event of an outage or another related event that causes data to become unavailable.
Some questions to consider here are:
- How will the vendor define a service outage?
- Will there be scheduled vendor downtime for maintenance etc.?
- Will there be an alternative vendor hot site, or a vendor site prepped to take on the access load, in the event of a vendor outage?
- Are there tools in place which will determine the severity of a vendor outage?
- How will the tenant be compensated in the event of a vendor outage?
- What levels of redundancy are in place to minimize vendor outages?
3) Costs: costs to consider are:
- How is the vendor's fee structured, and are taxes and external fees accounted for in a vendor quote?
- Will there be or are there current licensing fees above and beyond stated vendor service fees?
- Will there be any hidden or add on costs for vendor support?
- How does the vendor structure their charges? Is it based upon usage, traffic or a storage limit?
- Does the vendor offer price protection?
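The cost questions above can be pulled together into a simple quote calculator. All rates, the cost components and the tax treatment here are hypothetical, chosen only to show how usage, traffic, storage and tax combine into one figure:

```python
def monthly_cost(usage_hours: float, rate_per_hour: float,
                 storage_gb: float, rate_per_gb: float,
                 egress_gb: float, rate_per_egress_gb: float,
                 tax_rate: float = 0.0) -> float:
    """Hypothetical usage-based quote: compute + storage + traffic,
    with tax applied on top (an item quotes often omit)."""
    subtotal = (usage_hours * rate_per_hour
                + storage_gb * rate_per_gb
                + egress_gb * rate_per_egress_gb)
    return round(subtotal * (1 + tax_rate), 2)

# Same workload quoted with and without tax -- a reminder to ask
# whether the vendor's headline number is all-inclusive.
print(monthly_cost(720, 0.05, 500, 0.02, 100, 0.09))                 # 55.0
print(monthly_cost(720, 0.05, 500, 0.02, 100, 0.09, tax_rate=0.08))  # 59.4
```

Comparing vendors on a normalized calculation like this, rather than on headline rates, is one way to surface the hidden or add-on costs the questions above ask about.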
It is expected that Cloud Computing will be the wave of the future in terms of computing; it is only logical that the cloud's economies of scale and flexibility will impact how technology evolves and how users implement these technologies.
However in terms of security the massive availability of resources and data within a cloud does present a very attractive target for attackers.
That being said, we can assume that cloud-based defenses may be more robust, scalable and cost-effective, in an effort to mitigate security concerns regarding multiple tenants, encryption, trust and compliance.
Part of a cloud service is the API. However, when it comes to integration between vendors this may pose a problem for tenants, given that cloud APIs are not yet standardized. This means that each vendor has a specific API for managing its services, which will lock customers in to that vendor's proprietary technology.
The workaround here would be to look for vendors that use standard APIs wherever possible. This is a viable option, as standard APIs are already implemented for access to storage as well as for deploying and scaling applications.
In terms of auditing and forensics, dedicated, pay-per-use forensic images of virtual machines can be obtained by an auditor without having to take infrastructure offline. This results in less downtime for auditing, and it can provide cost-effective storage for logs without degrading system performance.
All of which will increase the return on investment and decrease the operational costs normally involved with in-house systems processing the same data as in the cloud.
Of course Cloud Computing is still in its infancy and whilst some proposals may look good in theory, only time will tell how we proceed and evolve with this system of computing.
- Cloud Connect
- Cloud Security Alliance
- Cloud Computing Journal
- European Network and Information Security Agency