
Can the Cloud survive regulation?

One of the greatest strengths of the Cloud is that, like the Internet, it knows no boundaries. It crosses industry and international boundaries as if they do not exist. But as is often the case, your greatest strength can also be your greatest weakness.

Take Google, for example, and its myriad Cloud-based application offerings. A new complaint filed against Google by EPIC (the Electronic Privacy Information Center) with the US Federal Trade Commission urges the regulatory agency to “consider shutting down Google’s services until it establishes safeguards for protecting confidential information.”

From a recent FT.com article:

In a 15-page complaint to the FTC, the Electronic Privacy Information Center (Epic) said recent reports suggested Google did not adequately protect the data it obtained. It cited vulnerabilities that revealed users' data in its Gmail webmail service, Google Docs online word processing and spreadsheets and in Google Desktop, which can index users' information held on their hard drives.

Google said it had not reviewed the filing in detail but it had "extensive policies, procedures and technologies in place to ensure the highest levels of data protection".

Privacy is mentioned as the primary concern, but reliability is also cited as problematic in the face of recent, well-covered outages of the search-engine giant’s services. A recent, nearly 24-hour outage of Microsoft’s Azure, though admittedly of a pre-release cloud (is there really such a thing?), is certain to be offered as further proof of the reliability problems of cloud-based services.

Security professionals have questioned the security of the cloud and its suitability for applications falling under governmental regulations like HIPAA and Basel II, as well as its compliance with industry-standard protections like PCI DSS.

GLOBAL CONCERN

What we see beginning to happen is that the cloud, which recognizes neither industry nor national boundaries, may fall subject to myriad – potentially conflicting – regulations regarding privacy and compliance. The US is certainly concerned with privacy, but in recent years the UK and the European Union in general have surpassed even the US in their regulatory concern for privacy.

Many of the EU laws and regulations regarding privacy are tougher than those in the US and elsewhere in the world, and the collision of these regulations may in fact cause cloud providers to reconsider their global scope. Indeed, even conflicting requirements across industries may be enough to warrant something akin to the creation of “niche” clouds: cloud centers serving specific segments of industry based on the need for compliance with specific regulations both in the US and abroad.

If regulations conflict, a generalized cloud may not be able to serve all industries or all countries without severely impacting the ability of other industries and countries to take advantage of its shared resources.

Regulations around privacy and protection of data go deeper than the surface, the application. The toughest regulations require certification of compliance from the application down through the infrastructure to the hardware. It is at the infrastructure layer – the servers, virtualization implementation, routers, switches, and application delivery network – that the impact of compliance and regulations may be felt by industries and countries for whom these regulations are not a concern.

SHARING MORE THAN RESOURCES

While it certainly appears on the surface that additional security and privacy mechanisms in the cloud would be a good thing for all customers, it is the impact that security and privacy implementations can have on the performance and capacity of the cloud that may actually increase the costs for everyone attempting to leverage cloud computing services.

Because the cloud is a shared environment, providers like Google and Microsoft must necessarily be aware that while today a given set of servers and infrastructure is serving up Bob’s Web 2.0 Social Networking and Microblogging Application, tomorrow – or in the next hour – it may be required to run an application that is more sensitive in terms of privacy and confidentiality, such as health records. While the applicability of regulations such as HIPAA to user-initiated storage and transfer of records has rarely been discussed, it is only a matter of time before privacy concerns are raised regarding this type of personally identifiable information.

Even a strategy as simple as instituting SSL everywhere in the cloud, to ensure the private transfer of data regardless of whether it needs to comply with governmental and industry regulation, can have a negative effect. The additional compute processing required to handle SSL can ultimately degrade performance and capacity on servers, meaning Bob may need to pay for additional instances in order to maintain the level of performance and user concurrency with which he is satisfied. Additional instances cost money (the cloud ain’t really free), and the impact of regulations begins to be felt by everyone.
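
To put rough numbers on that, here is a back-of-the-envelope sketch. The request rates and the 25% TLS CPU overhead below are purely hypothetical assumptions, not measurements from any provider, but the arithmetic shows how a blanket SSL policy can turn into extra instances on Bob’s bill.

import math

def extra_instances_needed(peak_rps: float,
                           rps_per_instance_plain: float,
                           tls_overhead_fraction: float) -> int:
    """Extra instances needed once TLS termination consumes part of each
    instance's capacity. Purely illustrative; every input is an assumption."""
    # Capacity left over after the assumed CPU cost of TLS handshakes/encryption.
    rps_per_instance_tls = rps_per_instance_plain * (1.0 - tls_overhead_fraction)
    before = math.ceil(peak_rps / rps_per_instance_plain)
    after = math.ceil(peak_rps / rps_per_instance_tls)
    return after - before

# Hypothetical workload: Bob's app peaks at 2,000 requests/sec, one instance
# handles 250 req/sec in plaintext, and blanket SSL is assumed to eat ~25% of CPU.
print(extra_instances_needed(2000, 250, 0.25))  # prints 3: three more instances to buy

In this invented scenario, the same peak load that fit on 8 instances now needs 11, and those three extra instances are a regulatory cost Bob never asked for.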

Financial services firms, who already seem unlikely customers of the cloud, are highly sensitive to the impact of latency and outages on their business. The additional burden of privacy and security implementations throughout the cloud infrastructure may very well make the cloud a truly hostile environment for such organizations, such that they will never adopt it as a viable alternative. Health care and related industries fall under the heavy-handed strictures set down by government regulations such as HIPAA in the US, which require specific protections for the transfer of personally identifiable information that are not necessarily addressed by today’s cloud computing providers, Google Health notwithstanding.

Additional infrastructure, solutions, and even cloud architectures designed to appease the needs of governments and industries will necessarily affect every user of the cloud, because it is a shared environment. Isolation of traffic, encryption, secure logs, audit trails, and other security- and privacy-related solutions must be universally applied because the resources within the cloud are ostensibly universally used. Whether an application needs them or not, whether the user wants them or not, becomes irrelevant, because it is the cloud provider who now participates in the compliance process and must ensure that it meets the demands of regulations imposed across industries and international boundaries.
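
To make the shared-resource argument concrete, here is a minimal sketch; the host, policy, and workload names are invented for illustration and do not reflect any provider’s actual API. It shows why a provider tends to leave compliance controls on for every tenant rather than toggling them per application.

from dataclasses import dataclass, field
from typing import List

@dataclass
class HostPolicy:
    # Controls assumed to be applied host-wide, to every tenant, needed or not.
    encrypt_at_rest: bool = True
    audit_logging: bool = True
    traffic_isolation: bool = True

@dataclass
class SharedHost:
    name: str
    policy: HostPolicy = field(default_factory=HostPolicy)
    tenants: List[str] = field(default_factory=list)

    def place(self, workload: str) -> None:
        # Bob's microblogging app and a clinic's records app may share this
        # host within the same hour, so the policy cannot be relaxed per tenant.
        self.tenants.append(workload)

host = SharedHost("shared-host-01")
host.place("bobs-web2.0-app")          # unregulated, still pays the overhead
host.place("clinic-health-records")    # HIPAA-sensitive, same host, same controls
print(host.policy)

Whether Bob’s application needs those controls is beside the point; the host does, because of who might land on it next.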

THE RISE OF THE REGULATED CLOUD?

It may be that we will see the rise of regulated clouds: clouds within clouds specifically designed to meet the demanding needs of the myriad governmental and industry-specific privacy and data protection regulations. Regulated clouds set aside – at a premium, of course – for those users and organizations who require a broader set of solutions to remain compliant even in the cloud.

The alternative is, of course, to implement a cloud architecture comprising an infrastructure and solutions designed to meet the most demanding of regulations and industry-specific needs. Doing so ensures that all users, regardless of which regulations they may fall under, are covered and need not worry about compliance. But the cost of doing so will not be trivial, and is sure to be passed on to all users one way or another. Such implementations would surely be explained away as “benefits” to all users (See? You get security and data protection *for free*!), but the reality is that the cost will be hidden in degraded capacity and performance that ultimately raise the long-term costs of doing business in the cloud.

With demands from organizations like EPIC to shut down Google’s services, and concerns raised by multiple industries about the reliability and security of the cloud in general, we are just beginning to see what sharing and “international” really mean: an increasingly complex web of requirements and regulations. That web may very well make the cloud a battle zone unsuitable for any organizational use until the conflicts between security, regulations, reliability, and privacy are addressed.


