By Kevin Nikkhoo
January 18, 2013 09:45 AM EST
In business, data is currency. It is the oil that keeps the commercial engine in motion, and databases are the digital banks that store and retrieve this valuable information. According to IDC, data is doubling every two years, and as the overall amount of data grows, so does the amount of sensitive and regulated data. All of this data stored by enterprises requires high levels of security, yet (again according to IDC) only about a quarter of it is currently being properly protected. Like all currency, data must be protected.
And herein lies a key issue. Too many executives see security as a cost center and are reticent to invest beyond the bare minimum: whatever keeps the nasty viruses out, whatever is absolutely necessary for compliance. Their thinking runs along the lines of "we haven't been attacked before" or "we don't have a high enough profile for hackers to care." I call this "ostriching": putting your head in the sand and hoping misfortune never darkens your door.
Reinforcing this attitude, many organizations look only at premises-based protection that encrypts or monitors network traffic containing critical information. For the average company, that can be a budget buster and a significant resource drain... that is, until they look toward the cloud and explore cloud-based security options.
Yet regardless of deployment option, most security experts agree that the best defense is a proactive strategy.
Data leak prevention (DLP), like most security efforts, is a complex challenge: it is meant to stop both the deliberate and the inadvertent release of sensitive information. Too many companies try to treat the symptoms rather than prevent them in the first place.
Part of the protection equation is being overlooked: database management systems must also be a component of a proactive data security strategy. Like the bank vault, a database requires strong protections at its foundation. DLP is one part of a comprehensive enterprise data security program built on best practices for protecting mission-critical enterprise data repositories. That security must both foil attackers who are financially motivated and won't be deterred by minimalist defenses, and prevent the accidental release of data. Data security will go nowhere without robust, proactive database security.
To achieve these goals, organizations need to implement a variety of solutions. Used cooperatively, they let a company instantly discover who is doing what, and when, on the network; identify the potential impact; and take the necessary steps to allow or prevent access and usage. Just like a bank vault: security cameras record who you are, you need a password to get into the vault itself (during business hours!), and you're only allowed to open your own safety deposit box (as long as you have the key). Here are four proactive measures you can take:
Intrusion detection (security information and event monitoring): The first step in protection is to know who is proverbially knocking on the door... or sneaking around the back entrance. Activity monitoring and blocking is the first line of defense at your firewall and beyond (this includes BYOD access). Vigilance on the front lines enables real-time correlation to detect patterns of traffic, spot usage anomalies and prevent internal or external attacks. SIEM provides the forensic analysis that determines whether any access to the network is friendly/permissible, suspicious or threatening, and that analysis is the basis for alerts that trigger appropriate action to prevent data leakage.
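The correlation step can be sketched in a few lines. This is a minimal illustration, not a SIEM: the five-minute window, the failure threshold and the event shape are all assumptions made for the example.

```python
from collections import defaultdict, deque
from datetime import datetime, timedelta

# Hypothetical rule: five failed logins from one source inside five
# minutes is treated as an attack pattern. Real SIEM rules are tuned
# per environment.
WINDOW = timedelta(minutes=5)
MAX_FAILURES = 5

failures = defaultdict(deque)  # source IP -> recent failed-login times

def classify(event):
    """Label a login event 'permissible', 'suspicious' or 'threatening'.

    `event` is a dict like {"ip": str, "time": datetime, "ok": bool}.
    """
    q = failures[event["ip"]]
    while q and event["time"] - q[0] > WINDOW:  # age out old failures
        q.popleft()
    if event["ok"]:
        return "permissible"
    q.append(event["time"])
    return "threatening" if len(q) >= MAX_FAILURES else "suspicious"
```

A real product would correlate many event types across sources; the point here is only that a rolling window plus a threshold is enough to turn raw events into an actionable alert.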
Traffic monitoring (log management): Once you know who's accessing the network, log management makes sense of the patterns and historical usage so you can identify suspicious IP addresses, locations and users as likely transgressors. If you can predict the traffic, you can create rules to block sources, prevent access and build a reportable audit trail of activity. But to be proactive, monitoring must be continuous and in real time. Combing through reams of machine logs days or weeks later might uncover breaches and careless users, but it can't prevent them. It is the proverbial equivalent of chasing the horse after it has left the barn.
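Turning raw logs into a blocklist plus an audit trail might look like the sketch below. The log format, the status codes counted and the `DENY_AFTER` threshold are hypothetical choices for illustration.

```python
import re
from collections import Counter

# Hypothetical access-log format: '<ip> <user> "<request>" <status>'.
LINE = re.compile(r'(?P<ip>\d+\.\d+\.\d+\.\d+) \S+ "(?P<req>[^"]*)" (?P<status>\d{3})')
DENY_AFTER = 3  # denied requests before an IP is flagged for blocking

def build_blocklist(log_lines):
    """One pass over raw log lines -> (set of IPs to block, audit trail)."""
    denials = Counter()
    audit = []
    for line in log_lines:
        m = LINE.match(line)
        if not m:
            continue  # unparseable lines are skipped, not silently trusted
        ip, status = m["ip"], int(m["status"])
        if status in (401, 403):  # authentication / authorization denials
            denials[ip] += 1
            audit.append((ip, m["req"], status))
    blocklist = {ip for ip, n in denials.items() if n >= DENY_AFTER}
    return blocklist, audit
```

Run continuously over a live log stream rather than in batch, this is the difference between a report and a defense.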
Provisioning (identity management): One of the best ways to ensure users access only the data they are entitled to see or use is proper delegation of user rights, handled through identity management provisioning. In all too many documented cases, a user (typically an employee) leaves the fold but never relinquishes access to sensitive information. Just as provisioning grants users certain rights, automatic de-provisioning keeps former employees and others away from restricted sections of your database. And when provisioning is connected to SIEM and log management, if a de-provisioned user tries a retired password or account, you know about it the moment it happens.
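The provision / de-provision / alert loop can be reduced to a toy sketch. `AccountDirectory` and its methods are invented for illustration, not drawn from any identity management product.

```python
from datetime import datetime

class AccountDirectory:
    """Toy provisioning store: tracks active accounts and records an
    alert whenever a de-provisioned account attempts to log in."""

    def __init__(self):
        self.active = set()
        self.alerts = []  # (user, timestamp) pairs for the SIEM to pick up

    def provision(self, user):
        self.active.add(user)

    def deprovision(self, user):
        self.active.discard(user)

    def attempt_login(self, user, when=None):
        if user in self.active:
            return True
        # A retired account is being used: record it the moment it happens.
        self.alerts.append((user, when or datetime.now()))
        return False
```

The essential point is the last branch: de-provisioning is not just removal, it is also a tripwire that feeds the monitoring layer.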
Authentication and credentialing (access management): This is more than password management (and making sure those codes are more substantial than "password123"). It means controlling access with two or more credentials, i.e., multi-factor authentication. For example, a hospital may require authorized personnel to present both a password and a unique, time-variable code to access protected or sensitive areas of the network or database. Doing so adds protection against the use of lost or stolen credentials: another layer of defense that can deflect potential data leakage.
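The "unique, time-variable code" in that example is typically a time-based one-time password. Here is a compact sketch of RFC 6238 TOTP verification using only the standard library; the `verify` helper and its `password_ok` flag are illustrative glue, not part of the standard.

```python
import hmac, hashlib, struct, time

def totp(secret: bytes, at: int, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 time-based one-time password (SHA-1 variant)."""
    counter = struct.pack(">Q", at // step)           # time step as 8-byte counter
    digest = hmac.new(secret, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(secret: bytes, password_ok: bool, code: str, now=None) -> bool:
    """Both factors must pass: the password check AND the current code."""
    now = int(now if now is not None else time.time())
    return password_ok and hmac.compare_digest(code, totp(secret, now))
```

Note the constant-time comparison (`hmac.compare_digest`): even the second factor should be checked in a way that leaks nothing to an attacker.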
In this assessment there are at least four individual solutions that require implementation and monitoring. If the executives were unwilling before, how can an IT department muster the leverage to find the money or the staffing to deploy this preventive strategy? The good news is that it doesn't have to do either. A unified security model (real-time event and access correlation technology) delivered from the cloud combines the capabilities and functionality of each of these toolsets into a strong, cost-effective enterprise platform. It leverages the key features in a single cooperative, centralized source that enhances visibility throughout the enterprise. All the cost-saving benefits inherent in cloud computing are realized, and as security-as-a-service, the need for additional headcount is moot: part of the service is live expert analysts watching over your virtual borders 24/7/365.
An additional benefit is the ability to fold existing investments into a REACT platform. If a company has previously invested in a log management or single sign-on solution, it can easily integrate the other pieces of the puzzle to ensure a layered, holistic approach, so all the independent silos are monitored and covered. Because the solutions interact and intersect with one another, their seamless communication creates a layered, responsive defense that anticipates, controls and alerts, as opposed to attempting to put the toothpaste back into the tube. The damage of a breach (whether through user carelessness, internal sabotage or direct attack) is more than the compliance fines and the blowback on the data currency affected. Substantial and detrimental as those are, they can't touch the cost of broken trust. That, in itself, is a driving reason to get ahead of the issue with proactive security.
With enterprise systems exposed to substantial risk from data loss, theft or manipulation, a unified security platform from the cloud strikes that fine balance between data leakage prevention, protection of IP assets and maintenance of compliance standards on one side, and cost and resource responsibility on the other. It is an accountable way of becoming proactive.