@CloudExpo: Article

Don't Stick Your Head in the Sand, Create a Proactive Security Strategy

Preventing data leakage from the cloud

In business, data is currency. It is the oil that keeps the commercial engine in motion, and databases are the digital banks that store and retrieve this valuable information. According to IDC, the volume of data is doubling every two years, and as it grows, so does the amount of sensitive and regulated data. All of this data stored by enterprises requires high levels of security, yet IDC also estimates that only about a quarter of it is properly protected today. Like all currency, data must be protected.

And herein lies a key issue. Too many executives see security as a cost center and are often reluctant to invest beyond the bare minimum: whatever keeps the nasty viruses out, whatever is absolutely necessary for compliance. Their thought process is akin to "we haven't been attacked before" or "we don't have a high enough profile for hackers to care." I call this "ostriching": putting your head in the sand and hoping misfortune never darkens your door.

To substantiate this attitude, many organizations look toward on-premises protection that encrypts or monitors network traffic containing critical information. For the average company, this can be a budget buster and a significant resource drain...that is, until they look toward the cloud and explore cloud-based security options.

Yet regardless of deployment options, most security experts will agree the best defense is a proactive strategy.

Data leak prevention (DLP), like most security efforts, is a complex challenge. It is meant to prevent the deliberate and inadvertent release of sensitive information. Too many companies are trying to cure the symptoms rather than prevent them in the first place.

Part of the protection equation is being overlooked: database management systems must also be a component of a proactive data security strategy. Like the bank vault, the database requires strong protections at its foundation. DLP is one part of a comprehensive enterprise data security program that includes security best practices for protecting mission-critical enterprise data repositories. That security must both foil financially motivated attackers who won't be deterred by minimalist defenses and prevent the accidental release of data. Data security will go nowhere without robust, proactive database security.

To properly achieve these goals, organizations need to implement functions that comprise a variety of solutions. Used cooperatively, they let a company instantly discover who is doing what, and when, on the network; identify the potential impact; and take the necessary steps to prevent or allow access and usage. Just like a bank vault: security cameras watch to see who you are, you need a password to get into the vault itself (during business hours!), and you're only allowed to open your own safety deposit box (as long as you have the key). Here are four proactive measures you can take:

Intrusion detection (security information and event monitoring): The first step in protection is to know who is proverbially knocking on the door...or sneaking around the back entrance. Activity monitoring and blocking is the first line of defense at your firewall and beyond (this includes BYOD access). Vigilance on the front lines creates real-time correlation to detect patterns of traffic, spot usage anomalies and prevent internal or external attacks. SIEM provides the forensic analysis that determines whether any access of the network is friendly/permissible, suspicious or threatening. This analysis is the basis for creating alerts and taking appropriate action to prevent data leakage.
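To make the correlation idea concrete, here is a minimal sketch, illustrative only and not any particular SIEM product's API: a sliding time window that flags a source generating too many failed-login events. The event shape, threshold, and window length are assumptions for the example.

```python
from collections import defaultdict, deque

# Hypothetical SIEM-style correlation: alert when one source produces
# too many failed logins inside a sliding time window.
WINDOW_SECONDS = 60
THRESHOLD = 5

failed_logins = defaultdict(deque)  # source IP -> timestamps of failures

def correlate(event):
    """Return an alert string if this event completes a suspicious pattern."""
    if event["type"] != "login_failure":
        return None
    ip, now = event["src_ip"], event["timestamp"]
    window = failed_logins[ip]
    window.append(now)
    # Drop events that have aged out of the sliding window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    if len(window) >= THRESHOLD:
        return f"ALERT: {len(window)} failed logins from {ip} in {WINDOW_SECONDS}s"
    return None

# Five rapid failures from one address trigger an alert on the fifth event.
alerts = [correlate({"type": "login_failure", "src_ip": "203.0.113.7",
                     "timestamp": 1000 + i}) for i in range(5)]
```

The key design point is that correlation happens per event as it arrives, which is what makes the alert real-time rather than forensic.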

Traffic monitoring (log management): Once you know who's accessing the network, log management makes sense of the patterns and historical usage so you can identify suspicious IP addresses, locations, and users as likely transgressors. If you can predict the traffic, then you can create rules to block sources, prevent access and create a reportable audit trail of activity. But to be proactive, it must be continuous and in real time. Looking at reams of machine logs days or weeks later might uncover breaches and careless users, but it can't prevent them. It is the proverbial equivalent of chasing the horse that has left the barn.
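A tiny sketch of the rule-generation step described above, with an assumed log format and threshold (no real log-management product is implied): count failed logins per source IP and emit a block rule for any offender over the limit.

```python
import re
from collections import Counter

# Illustrative auth-log lines; the format is an assumption for the example.
LOG_LINES = [
    "2012-08-01T10:00:01 login failed user=admin ip=198.51.100.9",
    "2012-08-01T10:00:02 login failed user=root ip=198.51.100.9",
    "2012-08-01T10:00:03 login ok user=alice ip=192.0.2.4",
    "2012-08-01T10:00:04 login failed user=admin ip=198.51.100.9",
]
FAIL_RE = re.compile(r"login failed user=\S+ ip=(\S+)")

def block_rules(lines, threshold=3):
    """Derive deny rules from failure counts; the counts double as an audit trail."""
    failures = Counter(m.group(1) for line in lines
                       if (m := FAIL_RE.search(line)))
    return [f"deny from {ip}  # {n} failed logins"
            for ip, n in failures.items() if n >= threshold]

rules = block_rules(LOG_LINES)
```

Run continuously against a live log stream rather than a static list, this same logic becomes the proactive blocking the article calls for; run in batch weeks later, it is only forensics.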

Provisioning (identity management): One of the best ways to ensure users access only the data they are entitled to see or use is through proper delegation of user rights, handled through identity management provisioning. In all too many documented cases, a user (typically an employee) leaves the fold but never relinquishes access to sensitive information. Just as provisioning grants users certain rights, automatic de-provisioning keeps former employees and others away from restricted sections of your database. And when provisioning is connected to SIEM and log management, if de-provisioned users try to use retired passwords or accounts, you know about it the moment it happens!
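The de-provisioning loop can be sketched in a few lines. This is a hypothetical model, not a real identity-management API: revoking an account moves it to a retired set, and any later use of it raises an immediate alert instead of silently succeeding.

```python
# Hypothetical account stores for the example.
active_accounts = {"alice", "bob", "carol"}
retired_accounts = set()

def deprovision(user):
    """Revoke all access rights for a departing user in one step."""
    active_accounts.discard(user)
    retired_accounts.add(user)

def check_login(user):
    """Allow active users; alert on retired credentials; deny unknowns."""
    if user in retired_accounts:
        return f"ALERT: retired account '{user}' attempted access"
    return "allow" if user in active_accounts else "deny: unknown user"

deprovision("bob")           # bob leaves the company
result = check_login("bob")  # any later attempt is flagged, not ignored
```

Feeding that alert into the SIEM described earlier is what turns a stale account from a silent liability into a tripwire.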

Authentication and credentialing (access management): This is more than password management (and making sure those codes are more substantial than "password123"); it means controlling access with two or more credentials (multi-factor authentication). For example, a hospital may require authorized personnel to present login credentials such as a password plus a unique variable code to access certain protected/sensitive areas of the network or database. In doing so, it gains additional protection against the use of lost or unauthorized credentials. It is another layer of protection that can deflect potential data leakage.
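The "unique variable code" in that hospital example is typically a time-based one-time password (TOTP, RFC 6238). Here is a minimal sketch of checking both factors together; the secret, user store, and helper names are illustrative assumptions, not a production design.

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """Time-based one-time password per RFC 6238 (HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32)
    counter = int((for_time if for_time is not None else time.time()) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F  # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

def authenticate(password, otp, stored_password, secret_b32, now=None):
    """Both factors must match: something you know plus something you have."""
    return (hmac.compare_digest(password, stored_password)
            and hmac.compare_digest(otp, totp(secret_b32, for_time=now)))

# Illustrative enrollment: a shared secret generated at setup time.
secret = base64.b32encode(b"supersecretkey!!").decode()
ok = authenticate("hunter2", totp(secret, for_time=1_000_000),
                  "hunter2", secret, now=1_000_000)
```

Because the code changes every 30 seconds, a stolen password alone is no longer enough, which is exactly the protection against lost or unauthorized credentials described above.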

In this assessment, there are at least four individual solutions that require implementation and monitoring. If the executives were unwilling before, how can an IT department muster the leverage to find the money or the proposed staffing to deploy this preventive strategy? The good news is they don't have to do either. A unified security model (real-time event and access correlation technology) delivered from the cloud combines the capabilities and functionality of each of these toolsets into a strong, cost-effective enterprise platform. It leverages the key features in a single cooperative, centralized source that enhances visibility throughout the enterprise. All the cost-saving benefits inherent in cloud computing are realized, and as security-as-a-service, the need for additional headcount is moot. Part of the service is live expert analysts watching over your virtual borders 24/7/365.

An additional benefit is the ability to leverage existing programs into a REACT platform. If a company previously invested in a log management or single sign-on solution, it can easily integrate the other pieces of the puzzle to ensure a layered, holistic approach. This way all the independent silos are monitored and covered. Because each of the solutions interacts and intersects with the others, the seamless communication creates a layered, responsive defense that anticipates, controls and alerts, as opposed to attempting to put the toothpaste back in the tube. The damage of a breach (whether through user carelessness, internal sabotage or direct attack) is more than just the compliance fines and the blowback on the data currency affected. Substantial and detrimental as those are, they can't touch the cost of broken trust. That, in itself, is a driving reason to get ahead on the issue of proactive security.

As enterprise systems are exposed to substantial risk from data loss, theft, or manipulation, unified security platforms from the cloud strike that fine balance between data leakage prevention, protection of IP assets and maintenance of compliance standards on one side, and cost/resource responsibility on the other. It is an accountable way of becoming proactive.

Kevin Nikkhoo

CloudAccess

More Stories By Kevin Nikkhoo

With more than 32 years of experience in information technology, and an extensive and successful entrepreneurial background, Kevin Nikkhoo is the CEO of the dynamic security-as-a-service startup CloudAccess. CloudAccess is at the forefront of the latest evolution of IT asset protection: the cloud.

Kevin holds a Bachelor of Science in Computer Engineering from McGill University, a Master of Computer Engineering from California State University, Los Angeles, and an MBA from the University of Southern California with an emphasis in entrepreneurial studies.
