|By Jon Shende||
|August 31, 2010 07:45 AM EDT||
[Adapted from my blog, December 2009]
Lately, all the hype in the IT community is about Cloud Computing. Small start-ups are offering several variations of Cloud services, and some of the established big players (Google, Amazon, IBM, Novell (aimed at cloud service providers), Sun) are stepping up their cloud offerings.
But what exactly is Cloud Computing? Is it virtualization? Is it the services we have accessed via a web browser over the years? Is it something totally new, or is it all of these, just rebranded?
The term Cloud Computing started gaining traction when Google and IBM launched a university initiative to address Internet-scale computing back in 2007.
These services have been evolving since the 1990s; earlier incarnations include Grid and Utility computing and the Software as a Service offerings we saw around a decade ago.
In a nutshell, we can draw an analogy: think of a utility service you use, say electricity. Your meter is read every few weeks and you receive a bill for the energy consumed between readings.
The same underlying premise applies to a cloud service: an end user subscribes to any of the offered cloud services and is billed, based on usage, for consumption of that particular service or series of services over the specified time frame.
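The metering analogy can be sketched in a few lines of code; the rates and usage figures below are invented for illustration and are not any vendor's actual pricing:

```python
# Toy pay-per-use billing: charge only for the resources actually consumed.
# Rates and usage figures are illustrative, not any vendor's real pricing.
RATES = {
    "compute_hours": 0.10,       # $ per instance-hour
    "storage_gb_months": 0.05,   # $ per GB-month stored
    "data_transfer_gb": 0.12,    # $ per GB transferred out
}

def monthly_bill(usage):
    """Sum metered usage against each rate, like a utility meter reading."""
    return sum(RATES[item] * amount for item, amount in usage.items())

usage = {"compute_hours": 720, "storage_gb_months": 50, "data_transfer_gb": 100}
print(f"Bill: ${monthly_bill(usage):.2f}")  # 72.00 + 2.50 + 12.00
```

Just as with the electricity meter, nothing is billed for capacity that sits unused.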
One can safely state that Cloud Computing, as an on-demand, self-service, pay-as-you-go utility, evolved from a combination of grid computing, virtualization, and automation.
Experts estimate that this will grow into a $42 billion industry by 2012; however, the implementation and usage of cloud computing models and services is not without issues.
Most business managers will consider the CapEx and OpEx aspects, especially in this economy. How much money an IT department can save while still maintaining operational efficiency and security is a primary focus; by implementing one or more cloud computing services, an enterprise can obtain the scale and flexibility it needs, and potentially save time as well through dynamic provisioning of needed services.
One claim for Cloud Computing is that it lowers costs, increases business agility and increases the velocity at which applications can be deployed. However, a good question to consider is whether one should expect its implementation to be disruptive, and to what extent.
In order to engage cloud computing services, business models will have to be adjusted, or downright changed, to effectively and efficiently manage the utility aspect of computing power used in everyday operations and the manner in which management will be able to utilize resources.
As with any implementation, standards and regulations need to be formulated and implemented to ensure that both vendor and tenant are in compliance and within the governance of an agreed set of policies.
As of now there are no formal standards directed solely toward cloud computing; however, NIST has proposed a potential framework standard called Standards Acceleration to Jumpstart Adoption of Cloud Computing (SAJACC). With this, every effort should be made to ensure the confidentiality, availability and integrity of data held within a cloud computing environment going forward.
The National Institute of Standards and Technology (NIST) defines cloud computing as "a pay-per-use model for enabling available, convenient, on-demand network access to a shared pool of configurable computing resources (e.g., networks, servers, storage, applications, services) that can be rapidly provisioned and released with minimal management effort or service provider interaction."
Cloud Computing Models: Cloud models can be one of the following three:
- Infrastructure as a Service (IaaS), e.g., Tier 3 and Amazon EC2. While the subscriber does not control the cloud infrastructure, they do have control over select portions of the network, e.g., firewalls, the operating system, deployed applications and storage.
- Platform as a Service (PaaS). This goes back to the early '70s, when it was referred to as Framework as a Service. What it does is simply provide different combinations of services to a subscriber supporting an application development life cycle, e.g., Google's App Engine, which lets a subscriber run web applications on Google's infrastructure, or Azure. In essence the subscriber uses programming languages (.NET, Java, Python) and tools supplied by the service provider, with no underlying responsibility for the cloud-deployed network, servers, operating system, storage, etc.
- Software as a Service (SaaS), e.g., Facebook and Salesforce.com: applications running on a cloud infrastructure that can be accessed via a web browser interface.
Bear in mind that there can be dependencies and relationships between the models: Infrastructure as a Service can be considered the foundation of Cloud Computing services, upon which Platform as a Service and then Software as a Service are built.
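One way to picture the split between the three models is as a division of management responsibility per layer of the stack. The layer breakdown below is a common simplification for illustration, not a formal standard:

```python
# Who manages each layer under IaaS, PaaS and SaaS (a common simplification).
LAYERS = ["application", "runtime", "operating_system", "servers", "storage", "network"]

MODELS = {
    # For each model, the layers the SUBSCRIBER is responsible for;
    # everything else falls to the provider.
    "IaaS": {"application", "runtime", "operating_system"},
    "PaaS": {"application"},
    "SaaS": set(),  # the subscriber just uses the application via a browser
}

def responsibility(model):
    """Map each layer to whoever manages it under the given service model."""
    return {layer: ("subscriber" if layer in MODELS[model] else "provider")
            for layer in LAYERS}

for model in MODELS:
    print(model, responsibility(model))
```

Reading the table top to bottom also shows the dependency noted above: each model hands one more slice of the stack over to the provider.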
These services can be implemented by the end user in four different manners:
1) Private Cloud, aka a corporate cloud, refers to a proprietary computing architecture providing hosted services to a limited number of people, residing behind a corporate firewall; in other words, a single tenant.
By using private clouds, enterprises/tenants will receive the same economies of scale and bi-directional scaling as a public cloud user.
However, being a single enterprise, or a division within one, brings the additional benefit of more control and security for data held within the private cloud. On-premise data centers can be converted into private clouds by implementing virtualization technologies from companies such as Microsoft, Citrix, VMware, Novell and Sun.
2) Public Cloud is a cloud provided for lease external to an entity's physical location, e.g., Amazon's EC2.
This deployment facilitates rapid scaling via virtualization technologies (which enable cloud user resources to rapidly start up and shut down) and can be utilized by multiple tenants; however, within this deployment users have no access to dedicated resources.
This results in users giving up a certain amount of control over the process, which in turn can raise security and compliance issues.
3) Hybrid Cloud is a mixture of the public and private models. This can be appealing for a company that chooses to store non-confidential data externally, say using Simple Storage Service (S3), whilst keeping private data in-house.
4) Managed Cloud - in this offering the physical infrastructure in operation is owned by the subscriber and can be housed within the subscriber's physical premises.
However, the service provider controls portions of the management and security of the service utility.
Some Deployment Concerns
As with the deployment of any IT system there will be challenges and cause for concern. Certain scenarios will have to be anticipated and use cases as well as processes to mitigate these concerns need to be clarified. Some examples of concerns are as follows:
1) Software licenses: software is typically subscribed either under proprietary licenses or under free and open source licenses. Software licenses govern the usage or redistribution of software, which is in most cases copyright protected.
Something to consider is how software that you want to deploy into a cloud is licensed.
Is the software you want to deploy licensed on a per-server basis or not, and how easy or difficult will deploying your software of choice into a cloud be?
Will proprietary software solutions need to be confined to dedicated hosting environments? Most likely yes, at least for now, or until you can find a vendor who can securely provide the software you need on a pay-as-you-go basis.
Because of how software licensing is structured, early cloud users have been found to use more open source software.
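As a rough illustration of why per-server licensing sits awkwardly with elastic clouds, consider a workload that bursts briefly but must be licensed for its peak. The figures below are entirely hypothetical:

```python
# Why per-server licenses fit poorly with elastic clouds: you pay for the
# peak number of servers, not the average. Figures are illustrative only.
PER_SERVER_LICENSE = 1000  # $ per server per year (hypothetical rate)

def yearly_license_cost(server_counts_by_month):
    """A per-server license must cover the PEAK concurrent server count."""
    return PER_SERVER_LICENSE * max(server_counts_by_month)

# An elastic workload: 2 servers most of the year, bursting to 20 in December.
servers = [2] * 11 + [20]
print(yearly_license_cost(servers))  # licensed for the December burst all year
```

Usage-metered or open source licensing avoids paying year-round for a one-month burst, which is consistent with the observation above.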
2) Single point of failure: when a mission-critical application is deployed via a single vendor, issues at the vendor's site may severely impact the availability of resources for the tenant.
The vendor may claim to have multiple, fully powered remote backup locations. However, when it comes to ensuring that, in addition, redundant cloud administration and infrastructure software are in place, the vendor may fall short.
3) Portability: a cause for concern is that each vendor may utilize different application APIs and data formats. This in turn may limit application and data portability to other environments, as vendors are likely to be using proprietary APIs, causing a "lock-in" situation: it is easy to sign a contract and use a vendor's service, but transferring out to another vendor could pose major issues.
Of course as the cloud computing environment evolves this may soon be remedied.
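Until standard APIs arrive, one defensive pattern is a thin in-house abstraction over each vendor's interface, so that only one adapter changes on migration. The vendor classes and method names below are entirely made up for illustration:

```python
# A thin adapter layer over two hypothetical vendor storage APIs, so that
# application code never calls a proprietary interface directly.
class VendorAStorage:
    """Stand-in for one vendor's proprietary storage client (hypothetical)."""
    def put_object(self, bucket, key, data): ...

class VendorBStorage:
    """Stand-in for another vendor's client with a different API (hypothetical)."""
    def upload(self, container, name, payload): ...

class CloudStorage:
    """The only interface application code is allowed to use."""
    def __init__(self, backend):
        self._backend = backend

    def save(self, name, data):
        # Translate the neutral call into whichever vendor API is configured.
        if isinstance(self._backend, VendorAStorage):
            self._backend.put_object("default", name, data)
        else:
            self._backend.upload("default", name, data)

# Swapping vendors now means changing one constructor argument,
# not rewriting every call site.
store = CloudStorage(VendorAStorage())
store.save("report.csv", b"...")
```

The lock-in does not disappear, but it is confined to the adapter rather than spread across the whole codebase.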
4) Security: the elephant in the cloud room. The most common fear with using a cloud deployment is a loss of control and security of data.
Granted, this is still a system built on hardware and software platforms and as such is still susceptible to traditional security attacks (DoS, DDoS, etc.); conversely, a point for consideration is that any security measure will be more cost-effective when implemented on a larger scale.
Any good IT manager has voiced concerns over whether employees/administrators at the cloud provider can be trusted not to look at data, or even modify it, and whether other customers sharing the cloud can hack or access data without leaving an audit trail.
From this, a tenant can ask about the methods the vendor employs to protect data, such as strong physical security, as well as what types of monitoring, intrusion detection and firewall equipment are in place at its centers.
Even worse is whether competitors could find out sensitive information such as customer orders, pricing and cost information, and negatively impact business. And of course what about privacy concerns and government regulations?
Other issues of concern can be:
- What levels of protection are in place to prevent one customer from accessing another customer's data or applications within a shared cloud space?
- Who will be liable for security breaches, and how will the law in any one jurisdiction ensure compliance?
- How well will a vendor system integrate with a tenant's security systems?
5) Scalability: every user or potential user of the cloud constantly hears of the substantial savings to be realized by utilizing cloud-based resources.
To take full advantage of the scalability of the cloud, there should be some form of dynamic measurement and resource management for applications held within a cloud.
Scalability within the cloud can be achieved by composing the service from other scalable services, as can be seen with Google App Engine.
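Dynamic resource management can be as simple as a threshold rule that adds capacity under load and releases it when idle; the thresholds here are arbitrary example values, not any provider's policy:

```python
# A minimal threshold-based scaling rule: add capacity when average load is
# high, release it when load is low. Thresholds are arbitrary examples.
def scale_decision(current_instances, avg_cpu_percent,
                   scale_up_at=80, scale_down_at=20, minimum=1):
    if avg_cpu_percent > scale_up_at:
        return current_instances + 1      # provision one more instance
    if avg_cpu_percent < scale_down_at and current_instances > minimum:
        return current_instances - 1      # release an idle instance
    return current_instances              # steady state: no change

print(scale_decision(4, 85))  # 5
print(scale_decision(4, 10))  # 3
print(scale_decision(1, 10))  # 1 (never scales below the minimum)
```

Real autoscalers add cooldown periods and step sizes, but the pay-per-use saving comes from exactly this loop: capacity tracks measured demand.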
6) Auditing: with the cloud one has to consider how compliance with ISO standards, Sarbanes-Oxley, HIPAA, PCI-DSS, etc. will affect what data can be deployed. This is more so when considering the attractiveness of data to unauthorized entities and the methods they could use to gain access to that data.
Any IT manager will also tell you that without proper planning the cost of an audit can be higher than expected.
As of this writing, I am not aware of any formulated standards for auditing within the cloud; however, I must state that for a business, auditing within the cloud may be an attractive option, as this can be done live with no downtime or interruption to business processes.
7) Compliance: there are no standards in place as of yet, but the National Institute of Standards and Technology (NIST) and others are working toward that end.
8) Other Data Access: what happens to my data if the vendor revokes my access or there is a system malfunction? This is a common question that anyone thinking of using the cloud should ask. Remember the data loss for T-Mobile customers using the services of Microsoft's subsidiary Danger? "Microsoft said any data that users had on their devices and is no longer there has almost certainly been permanently lost." Here there was no revocation of access, but an alleged system glitch.
A personal example occurred a few days ago when I tried to access a Gmail account I kept just for research and online backup.
The system message intimated that I had violated the "Terms of Agreement". What?! The Gmail account was hardly ever used to send email, and the Google Docs account was used as a second online backup for some of my documents and files. If this had not been a secondary backup, I would have lost access to all my uploaded documents and files, with no recourse for resolution but filling out a form and hoping for contact from the support center.
In order to address and mitigate these issues the tenant should ensure that workarounds and backup plans are worked into their Service Level Agreements (SLAs) with the vendor.
Service Level Agreements (SLAs)
A service level agreement is the part of a service contract where the level of service is formally defined. In practice, the term is sometimes used to refer to the contracted delivery time (of the service) or performance. Whilst there may not be much flexibility with a vendor in defining an SLA, I am confident that the laws of supply and demand will shift this more toward the tenant in the near future.
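Before negotiating, it is worth translating an SLA's availability percentage into concrete downtime. A quick sketch, using a 30-day month for simplicity:

```python
# Translate an SLA availability target into allowed downtime per month.
MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes in a 30-day month

def allowed_downtime_minutes(availability_percent):
    """Minutes per month the provider can be down and still meet the SLA."""
    return MINUTES_PER_MONTH * (1 - availability_percent / 100)

for target in (99.0, 99.9, 99.99):
    print(f"{target}% -> {allowed_downtime_minutes(target):.1f} min/month")
# Even a 99.9% SLA still permits roughly 43 minutes of downtime each month.
```

Knowing this number makes it much easier to judge whether a vendor's availability promise actually matches the tenant's business requirements.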
Cloud computing vendors are getting into this business to improve their bottom line and, if publicly traded, their shareholder value. At the end of the day their focus will be on making a profit on the services offered.
In light of this, most tenants may feel as though they are entering an arrangement where vendors create the SLAs for their own protection against litigation, with minimal assurances to the tenant.
That being said, this does not mean that an IT manager cannot make the SLA work as a tool to choose an appropriate service provider. An IT manager's main concern will be the security of data, and of course the traditional interpretation of the CIA triad (Confidentiality, Integrity and Availability) may not be applicable within their cloud service.
To start an IT manager can focus on the following when hammering out their SLA with a vendor:
1) Data Protection: ensure there is a clear definition as to who will have access to the data and what levels of protection are in effect for that data.
Some questions that can be asked are:
- How will data be encrypted?
- How will compliance be addressed?
- What are the levels of access control?
- Will there be sub-contractors or third party providers processing the data?
- Where are backups stored?
- How is the data center secured?
- What happens to the data if service providers are switched?
- What processes are in place to mitigate legal inquiries about a customer's data?
- How often are audits done and what types of auditing tools are in place?
- What happens to my data if there is an investigation taking place on another tenant sharing services and how will you ensure my access to my data in the event of equipment seizure by federal entities?
- How is data deletion handled?
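On the integrity side of the CIA triad, a tenant can verify independently that data held with a provider has not been altered by keeping a local fingerprint of each object. A minimal sketch using a SHA-256 digest:

```python
# Verifying the integrity of data stored with a provider: keep a local
# SHA-256 digest and compare it against the retrieved copy later.
import hashlib

def digest(data: bytes) -> str:
    """Return the SHA-256 fingerprint of the data as a hex string."""
    return hashlib.sha256(data).hexdigest()

original = b"quarterly-results.csv contents"
local_fingerprint = digest(original)   # recorded locally before upload

retrieved = original                   # what the provider returns later
assert digest(retrieved) == local_fingerprint, "data was altered in the cloud"

tampered = original + b" (modified)"
print(digest(tampered) == local_fingerprint)  # False: tampering is detected
```

A digest proves integrity but not confidentiality; encryption and access control remain separate questions for the SLA.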
2) Continuity: one has to consider what happens in the event of an outage or another related event that causes data to become unavailable.
Some questions to consider here are:
- How will the vendor define a service outage?
- Will there be scheduled vendor downtime for maintenance, etc.?
- Will there be an alternative vendor hot site, or a vendor site prepped to take on the access load, in the event of a vendor outage?
- Are there tools in place to determine the severity of a vendor outage?
- How will the tenant be compensated in the event of a vendor outage?
- What levels of redundancy are in place to minimize vendor outages?
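Compensation clauses are often expressed as service credits tied to measured uptime. The tiers and percentages below are hypothetical, purely to show how such a clause might be evaluated:

```python
# A hypothetical service-credit clause: the tenant is credited a percentage
# of the monthly fee based on how far measured uptime fell below the target.
def service_credit(monthly_fee, measured_uptime, target_uptime=99.9):
    if measured_uptime >= target_uptime:
        return 0.0                   # SLA met: no credit due
    if measured_uptime >= 99.0:
        return monthly_fee * 0.10    # minor breach: 10% credit (hypothetical)
    return monthly_fee * 0.25        # major breach: 25% credit (hypothetical)

print(service_credit(1000, 99.95))  # 0.0
print(service_credit(1000, 99.5))   # 100.0
print(service_credit(1000, 97.0))   # 250.0
```

Pinning the tiers down in the SLA, rather than leaving compensation vague, gives the tenant something enforceable when an outage occurs.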
3) Costs: costs to consider are:
- How is the vendor's fee structured, and are taxes and external fees accounted for in a vendor quote?
- Will there be or are there current licensing fees above and beyond stated vendor service fees?
- Will there be any hidden or add on costs for vendor support?
- How does the vendor structure its charges? Is it based upon usage, traffic or a storage limit?
- Does the vendor offer price protection?
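Which charge structure is cheaper depends entirely on the usage profile, as a quick comparison of two hypothetical pricing schemes shows:

```python
# Comparing two hypothetical charge structures for the same workload:
# one vendor bills per GB stored, the other a flat tier plus overage.
def usage_based(gb_stored, rate_per_gb=0.08):
    """Pure pay-per-use: every GB is billed at the same rate."""
    return gb_stored * rate_per_gb

def tiered(gb_stored, flat_fee=20.0, included_gb=500, overage_rate=0.10):
    """Flat monthly fee with an included allowance, plus overage charges."""
    extra = max(0, gb_stored - included_gb)
    return flat_fee + extra * overage_rate

for gb in (100, 500, 2000):
    print(gb, usage_based(gb), tiered(gb))
# Light users favor pay-per-use; heavy users up to a point favor the tier.
```

Running the tenant's own projected usage through both formulas, rather than comparing headline rates, is the honest way to answer the cost questions above.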
It is expected that Cloud Computing will be the wave of the future in terms of computing; it is only logical that the cloud's economies of scale and flexibility will impact how technology evolves and how users implement these technologies.
However, in terms of security, the massive concentration of resources and data within a cloud does present a very attractive target for attackers.
That being said, we can assume that cloud-based defenses may be more robust, scalable and cost-effective in mitigating security concerns regarding multiple tenants, encryption, trust and compliance.
Part of a cloud service is its API. However, when it comes to integration between vendors this may pose a problem for tenants, given that cloud APIs are not yet standardized. Each vendor has a specific API for managing its services, which can lock customers in through vendor-proprietary technology.
The workaround here is to look for vendors that use standard APIs wherever possible. This is a viable option, as standard APIs are already implemented for access to storage as well as for deploying and scaling applications.
In terms of auditing and forensics, dedicated, pay-per-use forensic images of virtual machines can be obtained by an auditor without having to take infrastructure offline. This results in less downtime for auditing, and can provide cost-effective storage for logs without degrading system performance.
All of which will increase the return on investment and decrease the operational costs normally involved with in-house systems processing the same data as in the cloud.
Of course Cloud Computing is still in its infancy and whilst some proposals may look good in theory, only time will tell how we proceed and evolve with this system of computing.