
Big Data Journal: Article

Protecting Data in the Cloud

Today’s cloud-driven, always-connected world enables organizations to be very agile but also puts data integrity at risk

The cloud plays an integral role in enabling the agility required to take advantage of new business models and to do so in a very convenient and cost-effective way. However, this also means that more personal information and business data will exist in the cloud and be passed back and forth. Maintaining data integrity is paramount.

Today's approach to security in the cloud may not be sufficient: it doesn't put controls close to the data, which is now more fluid, and it doesn't distinguish one set of data from another. Not all data is created equal, and it should not all be treated in the same manner; a one-size-fits-all model doesn't work.

In this always-connected world, protection measures in the cloud need to focus on what really matters - the type of data, how it is used, and where it goes.

Data Classification
In order to adequately protect data in the cloud, organizations need to start considering how to classify data. One approach is to use a three-tier data protection model to cater to data of different sensitivities and relevance across industries. This model would include:

Tier 1, Regulated: Data subject to regulation, or data that carries with it proprietary, ethical, or privacy considerations such as personally identifiable information (PII). Unauthorized disclosure of regulated data may have serious adverse effects on an organization's reputation, resources, services, or individuals and requires the most stringent level of control.

Tier 2, Commercial: Industry-related, ecommerce or transactional and intellectual property data whose unauthorized disclosure may have moderately adverse effects on an organization's reputation, resources, services, or individuals. Commercial data requires a moderate level of security.

Tier 3, Collaborative: Collaborative and DevOps-type data that typically is publicly accessible, requires minimal security controls and poses little or no risk to the consuming organization's reputation, resources, or services.
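The three tiers above can be expressed as a small classifier. This is a minimal sketch, not a standard implementation; the attribute names (`contains_pii`, `regulated`, `public`) are illustrative assumptions about how contextual metadata might be recorded:

```python
from enum import IntEnum

class Tier(IntEnum):
    """Three-tier data protection model: a lower number means stricter controls."""
    REGULATED = 1      # PII, regulated, or proprietary data
    COMMERCIAL = 2     # transactional and intellectual-property data
    COLLABORATIVE = 3  # publicly accessible, collaborative/DevOps data

def classify(record: dict) -> Tier:
    """Assign a tier from contextual attributes of the data (hypothetical schema)."""
    if record.get("contains_pii") or record.get("regulated"):
        return Tier.REGULATED
    if record.get("public"):
        return Tier.COLLABORATIVE
    return Tier.COMMERCIAL
```

In practice the attribute set would come from the security team's partnership with business users, but the decision logic stays the same: default to the middle tier, escalate on sensitivity, relax only for explicitly public data.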

Using this model, security teams can strategically partner with business users to understand requirements and determine the right approach for their organization. Small to mid-sized organizations, enterprises, and service providers can apply this model to begin classifying their data based on contextual attributes such as how the data will be accessed, stored, and transmitted. Once the data is classified, they can then apply appropriate data protection measures focused on protecting work streams and transactions that continue to evolve to enable business agility. Given that most of today's data breaches are a result of user-access issues, security considerations such as Identity and Access Management, Authorization, and Authentication are critical.
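To make the link between classification and access management concrete, here is a hedged sketch of a tier-aware authorization check. The per-tier requirements (MFA, role names) are invented for illustration, not drawn from any product:

```python
# Per-tier access requirements; the specific roles and MFA rules are assumptions.
# Tier 1 = Regulated, Tier 2 = Commercial, Tier 3 = Collaborative.
REQUIREMENTS = {
    1: {"mfa": True,  "roles": {"data-steward"}},
    2: {"mfa": True,  "roles": {"data-steward", "analyst"}},
    3: {"mfa": False, "roles": set()},
}

def authorize(tier: int, user: dict) -> bool:
    """Grant access only when the user's context satisfies the tier's policy."""
    policy = REQUIREMENTS[tier]
    if policy["mfa"] and not user.get("mfa_passed"):
        return False
    if policy["roles"] and not (policy["roles"] & set(user.get("roles", []))):
        return False
    return True
```

Because most breaches trace back to user access, gating every request through a check like this - keyed to the data's tier rather than to a single global policy - is where classification starts paying off.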

The Data Integrity Challenge
Understanding and classifying data is just a first step, albeit an important one. Organizations also need to determine how to ensure data integrity when the perimeter is amorphous and control of the endpoints and the data is diminished by mobility and cloud services.

Business departments are increasingly encouraged to find efficient and innovative ways to generate new business. This requires identifying new applications and ways to support the business anywhere and anytime. Business users often make the decision to use the cloud before involving IT since they can get up and running in a fraction of the time and cost it would take to provision in house.

With this unprecedented change in operations and infrastructure comes an unprecedented need for ensuring data integrity - ultimately working through the life cycle of data that can, at any point, be within the confines of a company, out to a network of partners and suppliers, or floating in a cloud. The challenge in this fractured landscape is that the perimeter is amorphous, but legacy security solutions are not; they were designed for a time when the perimeter was well defined. The result is that attackers now use various techniques to bypass traditional perimeter-based defenses and compromise data - be it through tampering, stealing, or leaking. Point-in-time defenses are no longer sufficient.

To effectively protect data wherever it may be, defenses must go beyond simple blocking and detection to include capabilities such as data correlation, continuous data analysis, and retrospective action when data is found to have been corrupted, tampered with, or exfiltrated.

A New Approach to Applying Controls
In order to protect the classes of data described earlier - regulated, commercial, and collaborative - security teams need a mix of policy, process, and technology controls. These controls should be applied based on user and location context and according to a security model that is open, integrated, continuous, and pervasive:

  • Open, to provide access to global intelligence and context to detect and remediate breaches, and to support new standards for data protection.
  • Integrated, so that solutions automate policy, minimize manual processes, close gaps in security, and support centralized management and control according to data classifications.
  • Continuous, because both point-in-time solutions and always-on capabilities are needed to identify new threats to data.
  • Pervasive, delivering protection across the full attack continuum - before, during, and after an attack.

Let's take a closer look at the advantages of applying controls to protect data based on this model.

Openness provides:

  • The opportunity to participate in an open community of users and standards bodies to ensure consistent data classification and standards of policy and process.
  • Easy integration with other layers of security defenses to continue to uphold data protection best practices as IT environments and business requirements change.
  • The ability to access global intelligence with the right context to identify new threats and take immediate action.

Integrated enables:

  • Technology controls that map to data tiers and also track data through different usage contexts and locations to support the fundamental first step of data classification.
  • Identity and access controls, authorization, and authentication that work in unison to map data protection to data classifications.
  • Encryption controls applied according to the data's deemed sensitivity to further strengthen protection, including strong encryption key standards (minimum AES-256) and encryption keys retained by data owners.
  • Security solutions and technologies that seamlessly work together to protect data across its entire lifecycle.
  • Centralized policy management, monitoring, and distributed policy enforcement to ensure compliance with regulatory and corporate policies.
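A centralized encryption policy keyed to the data tiers might look like the sketch below. The per-tier parameters are assumptions chosen to match the article's minimum (AES-256, owner-retained keys for the most sensitive tier); a real deployment would pull these from a policy service:

```python
# Minimum encryption parameters per tier (1 = Regulated, 2 = Commercial,
# 3 = Collaborative). Values are illustrative assumptions.
ENCRYPTION_POLICY = {
    1: {"cipher": "AES-256-GCM", "min_key_bits": 256, "key_owner": "data_owner"},
    2: {"cipher": "AES-256-GCM", "min_key_bits": 256, "key_owner": "provider_kms"},
    3: None,  # collaborative, public data may be stored unencrypted
}

def compliant(tier: int, cipher: str, key_bits: int) -> bool:
    """Check a stored object's encryption against its tier's minimum policy."""
    policy = ENCRYPTION_POLICY[tier]
    if policy is None:
        return True
    return cipher == policy["cipher"] and key_bits >= policy["min_key_bits"]
```

Centralizing the table, and distributing only the enforcement check, is what lets monitoring and compliance reporting stay consistent across cloud, endpoint, and on-premises stores.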

Continuous supports:

  • Technologies and services to constantly aggregate and correlate data from across the connected environment with historical patterns and global attack intelligence to maintain real-time contextual information, track data movement, and detect data exfiltration.
  • The ability to leverage insights into emerging new threats, take action (automatically or manually) to stop these threats, and use that intelligence to protect against future data breaches.

Pervasive translates into:

  • Defenses (including technologies and best practices) that address the full attack continuum - before, during, and after an attack. Before an attack, total, actionable visibility is required to see who is accessing what data from where and how, and to correlate that information against emerging threat vectors. During an attack, continuous visibility and control to analyze and take action in real time to protect data is necessary. After an attack, the key is to mitigate the damage, remediate, quickly recover, and prevent similar, future data breaches, data tampering, or data corruption activities.
  • The ability to address all attack vectors - including network, endpoints, virtual, the cloud, email and Web - to mitigate risk associated with various communications channels that could be used by an attacker to compromise data.

Today's cloud-driven, always-connected world is enabling organizations to be very agile but it is also putting data integrity at risk. IT teams need to quickly adapt to this new way of doing business despite having less control of the endpoints and the data. Traditional data protection models fail due to their inability to discriminate one set of data from another. By putting in place protection measures based on the type of data, how it is used, and where it goes, and backed by a security model that is open, integrated, continuous, and pervasive, organizations can take advantage of new business opportunities the cloud affords without sacrificing data integrity.

More Stories By Raja Patel

Raja Patel is a Senior Director, Cloud Security Product Management, at Cisco, where he is responsible for the portfolio strategy and development of security solutions for Cisco's Security Business. His responsibilities include building solutions and managing operations associated with Cloud, Threat Intelligence, Web and Email Security. Raja has been at Cisco for 13 years, and during this tenure he has managed a broad portfolio of products within Cisco’s Enterprise Networking Business Group, developed and accelerated new consumption and business models such as Enterprise Licensing, and led strategic initiatives to develop more agile business practices across Cisco.

Mr. Patel holds a BS in Aerospace Engineering with a Minor in Mathematics from Embry-Riddle Aeronautical University, and an MBA in Global Business Management.

