
@CloudExpo: Article

Don't Stick Your Head in the Sand, Create a Proactive Security Strategy

Preventing data leakage from the cloud

In business, data is currency. It is the oil that keeps the commercial engine in motion, and databases are the digital banks that store and retrieve this valuable information. According to IDC, the amount of data is doubling every two years, and as the overall volume grows, so does the amount that is sensitive or regulated. All of this data stored by enterprises requires high levels of security, yet (again, according to IDC) only about a quarter of it is properly protected today. Like all currency, data must be protected.

And herein lies a key issue. Too many executives see security as a cost center and are often reluctant to invest beyond the bare minimum: whatever keeps the nasty viruses out, whatever is absolutely necessary for compliance. Their thought process runs along the lines of “we haven’t been attacked before” or “we don’t have a high enough profile for hackers to care.” I call this “ostriching”: putting your head in the sand and hoping misfortune never darkens your door.

To substantiate this attitude, many organizations look toward on-premises protection that encrypts or monitors network traffic containing critical information. For the average company, this can be a budget buster and a significant resource drain... that is, until they look toward the cloud and explore cloud-based security options.

Yet regardless of deployment options, most security experts will agree that the best defense is a proactive strategy.

Data leak prevention (DLP), like most security efforts, is a complex challenge. It is meant to prevent both the deliberate and the inadvertent release of sensitive information. Too many companies are trying to treat the symptoms rather than prevent the disease in the first place.

Part of the protection equation is being overlooked: database management systems must also be a component of a proactive data security strategy. Like the bank vault, the database requires strong protections at its foundation. DLP is one part of a comprehensive enterprise data security program that applies security best practices to mission-critical enterprise data repositories. That security must both foil attackers who are financially motivated and won’t be deterred by minimalist defenses, and prevent the accidental release of data. Data security will go nowhere without robust, proactive database security.

To properly achieve these goals, organizations need to implement functions that comprise a variety of solutions. Used cooperatively, they let a company instantly discover who is doing what, and when, on the network; identify the potential impact; and take the necessary steps to prevent or allow access and usage. Just like a bank vault: security cameras follow you to see who you are, you need a password to get into the vault itself (during business hours!), and you’re only allowed to open your own safety deposit box (as long as you have the key). Here are four proactive measures you can take:

Intrusion detection (security information and event monitoring): The first step in protection is to know who is proverbially knocking on the door... or sneaking around the back entrance. Activity monitoring and blocking is the first line of defense at your firewall and beyond (this includes BYOD access). Vigilance on the front lines creates real-time correlation to detect patterns of traffic, spot usage anomalies and prevent internal or external attacks. SIEM provides the forensic analysis that determines whether any access to the network is friendly/permissible, suspicious or threatening. This analysis is the basis for alerts that trigger appropriate action to prevent data leakage.
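The correlation step described above can be sketched in a few lines. This is a minimal illustration, not any vendor's SIEM: the event format, thresholds and function names here are assumptions chosen for the example.

```python
from collections import defaultdict, deque

# Illustrative thresholds: flag any source that produces this many
# failed logins inside the sliding time window.
FAIL_THRESHOLD = 5
WINDOW_SECONDS = 60

def correlate(events):
    """events: iterable of (timestamp, source_ip, outcome) tuples.
    Returns the set of source IPs whose failed-login rate crosses
    the threshold -- the kind of pattern a SIEM would alert on."""
    recent = defaultdict(deque)   # source_ip -> timestamps of recent failures
    alerts = set()
    for ts, ip, outcome in sorted(events):
        if outcome != "failure":
            continue
        window = recent[ip]
        window.append(ts)
        # Drop failures that fell outside the sliding window.
        while window and ts - window[0] > WINDOW_SECONDS:
            window.popleft()
        if len(window) >= FAIL_THRESHOLD:
            alerts.add(ip)
    return alerts
```

A real SIEM correlates many event types (logins, file access, privilege changes) across sources, but the core idea is the same: turn a stream of raw events into a small set of actionable alerts in real time.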

Traffic monitoring (log management): Once you know who’s accessing the network, log management makes sense of the patterns and historical usage so you can identify suspicious IP addresses, locations and users as likely transgressors. If you can predict the traffic, you can create rules to block sources, prevent access and create a reportable audit trail of activity. But to be proactive, it must be continuous and in real time. Combing through reams of machine logs days or weeks later might uncover breaches and careless users, but it can’t prevent them. It is the proverbial equivalent of chasing the horse after it has left the barn.
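As a toy version of the pattern analysis above, the sketch below scans web-server-style log lines and flags IPs with repeated error responses. The log format, regex and `min_errors` cutoff are illustrative assumptions; production log management handles many formats and runs continuously over a stream, not a batch.

```python
import re
from collections import Counter

# Assumed log shape: "<ip> ... <3-digit status>" at end of line,
# loosely modeled on common web-server access logs.
LOG_PATTERN = re.compile(r'^(\d{1,3}(?:\.\d{1,3}){3})\s.*\s(\d{3})$')

def suspicious_sources(log_lines, min_errors=3):
    """Flag IPs with repeated 4xx/5xx responses -- a crude stand-in
    for identifying likely transgressors from historical usage."""
    errors = Counter()
    for line in log_lines:
        m = LOG_PATTERN.match(line)
        if m and m.group(2).startswith(("4", "5")):
            errors[m.group(1)] += 1
    return {ip for ip, n in errors.items() if n >= min_errors}
```

The important property is the audit trail: the same counts that drive blocking rules can be reported for compliance.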

Provisioning (identity management): One of the best ways of ensuring users access only the data they are entitled to see or use is through proper delegation of user rights, handled through identity management provisioning. In all too many documented cases, a user (typically an employee) leaves the fold but never relinquishes access to sensitive information. Just as provisioning gives users certain rights, automatic de-provisioning keeps former employees and others away from restricted sections of your database. And when identity management is connected to SIEM and log management, if de-provisioned users try to use retired passwords or accounts, you know about it as it happens.
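The provision/de-provision life cycle, and the hand-off to SIEM when a retired account is used, might be sketched like this. The class and method names are hypothetical, chosen only to illustrate the flow.

```python
class AccessDirectory:
    """Toy identity store: grants entitlements on provisioning,
    revokes them on de-provisioning, and records an alert whenever
    a retired account is used (the SIEM hand-off)."""

    def __init__(self):
        self.active = {}       # user -> set of entitlements
        self.retired = set()   # de-provisioned accounts
        self.alerts = []       # events to forward to SIEM/log management

    def provision(self, user, entitlements):
        self.active[user] = set(entitlements)

    def deprovision(self, user):
        self.active.pop(user, None)
        self.retired.add(user)

    def authorize(self, user, resource):
        if user in self.retired:
            # A former employee is trying a retired account:
            # deny, and surface the attempt immediately.
            self.alerts.append(f"retired account used: {user} -> {resource}")
            return False
        return resource in self.active.get(user, set())
```

In practice de-provisioning is triggered automatically from the HR system, so access dies the moment the employment record does.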

Authentication and credentialing (access management): This is more than password management (and making sure those codes are more substantial than “password123”); it means making sure access is controlled by at least two credentials (multi-factor authentication). For example, a hospital may require authorized personnel to present login credentials such as a password plus a unique variable code to access certain protected or sensitive areas of the network or database. In doing so, it gains additional protection against the use of lost or unauthorized credentials. It is another layer of protection that can deflect potential data leakage.
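The “unique variable code” in the hospital example is typically a time-based one-time password (TOTP, RFC 6238). A minimal sketch of the second factor, using only the Python standard library, looks like this; the `verify` wrapper is an illustrative assumption showing how both factors must pass together.

```python
import hashlib
import hmac
import struct

def totp(secret: bytes, at: int, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password (SHA-1 variant):
    HMAC over the time-step counter, dynamically truncated to N digits."""
    counter = struct.pack(">Q", at // step)
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(secret: bytes, password_ok: bool, submitted_code: str, at: int) -> bool:
    # Multi-factor: BOTH the password check and the time-based code
    # must succeed; compare_digest avoids timing leaks.
    return password_ok and hmac.compare_digest(totp(secret, at), submitted_code)
```

Because the code changes every 30 seconds, a stolen password alone is no longer enough to open the vault.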

In this assessment, there are at least four individual solutions that require implementation and monitoring. If the executives were unwilling before, how can an IT department muster the leverage to find the money or the staffing to deploy this preventive strategy? The good news is they don’t have to do either. A unified security model (real-time event and access correlation technology) delivered from the cloud combines the capabilities and functionality of each of these toolsets into a strong, cost-effective enterprise platform. It leverages the key features in a single, cooperative, centralized source that enhances visibility throughout the enterprise. All the cost-saving benefits inherent in cloud computing are realized, and as security-as-a-service, the need for additional headcount is moot. Part of the service is live expert analysts watching over your virtual borders 24/7/365.

An additional benefit is the ability to fold existing investments into a REACT platform. If a company previously invested in a log management or single sign-on solution, it can easily integrate the other pieces of the puzzle to ensure a layered, holistic approach. This way all the independent silos are monitored and covered. Because each of the solutions interacts and intersects with the others, the seamless communication creates a layered, responsive defense that anticipates, controls and alerts, as opposed to attempting to put the toothpaste back into the tube. The damage of a breach (whether through user carelessness, internal sabotage or direct attack) is more than just the compliance fines and the blowback on the data currency affected. Substantial and detrimental as those are, they can’t touch the cost of broken trust. That, in itself, is a driving reason to get ahead on the issue of proactive security.

As enterprise systems are exposed to substantial risk from data loss, theft or manipulation, a unified security platform from the cloud strikes that fine balance between data leakage prevention, protection of IP assets and maintenance of compliance standards on one side, and cost and resource responsibility on the other. It is an accountable way of becoming proactive.

Kevin Nikkhoo

CloudAccess

More Stories By Kevin Nikkhoo

With more than 32 years of experience in information technology, and an extensive and successful entrepreneurial background, Kevin Nikkhoo is the CEO of the dynamic security-as-a-service startup CloudAccess. CloudAccess is at the forefront of the latest evolution of IT asset protection--the cloud.

Kevin holds a Bachelor of Science in Computer Engineering from McGill University, a Master’s in Computer Engineering from California State University, Los Angeles, and an MBA from the University of Southern California with an emphasis in entrepreneurial studies.
