"Continuous" Does Not Equal Real Time

Continuous monitoring is enough for compliance, but ISN’T enough for securing data

Every 4,000 miles or so I bring my car in to have the oil changed, the brakes checked and the tires rotated. Why? Because I know that if I leave it to chance, at some point down the road something far more devastating will happen to the car. Many of us follow this simple preventive best practice.

Then why do major corporations and modest enterprises alike wait until their security is breached to address growing concerns about data theft, private information leakage or worse? Many of these companies spend hundreds of thousands of dollars on various security initiatives (especially those bound by a regulatory compliance agency), yet still succumb to breaches that cost, on average, $3.8 million per occurrence to address (per the Ponemon Institute).

Two instances dropped into my inbox this week: a medical center in Long Beach, California, and a Medicaid office in New York State both experienced similar types of breaches that, in my opinion, were completely preventable.

It boils down to continuous monitoring...and that practice doesn't go far enough.

Continuous monitoring is the cornerstone of many compliance mandates. You find it in HIPAA, PCI, FISMA and others. Something (usually an archival solution gathering syslog files) must collect records of every event that touches the network perimeter. For a medium-size health care facility, that could be more than 500 events per second. For larger organizations, like the Long Beach Medical Center and the Office of the Medicaid Inspector General, the volume of continuous activity is likely five times that. That's a lot of log lines to comb through to find a breach.
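To put those rates in perspective, here is a back-of-the-envelope calculation of the daily log volume they imply. The 500 events/second figure comes from the article; the average bytes-per-event is an illustrative assumption.

```python
# Rough daily syslog volume at the event rates discussed above.
EVENTS_PER_SECOND = 500   # mid-size health care facility (article figure)
BYTES_PER_EVENT = 300     # assumed average syslog line size

seconds_per_day = 24 * 60 * 60
events_per_day = EVENTS_PER_SECOND * seconds_per_day
gb_per_day = events_per_day * BYTES_PER_EVENT / 1e9

print(f"{events_per_day:,} events/day")          # 43,200,000 events/day
print(f"{gb_per_day:.1f} GB/day of raw syslog")  # 13.0 GB/day

# At five times that rate (a larger organization), the daily count
# scales to 216 million events; far too many to review once a month.
print(f"{events_per_day * 5:,} events/day at 5x")
```

Even with generous assumptions, a once-a-month manual review of a billion-plus events is not a realistic way to catch a breach.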

Many hospitals and health care organizations are under great strain to maintain certain security and privacy protocols because of these compliance laws. They spend a great deal of time and money on security, yet far too often we hear of a breach at some facility or company. There must be a disconnect somewhere.

I think the disconnect is in how the term continuous monitoring is defined and applied as a preventive best practice. Mandates state that systems must be continuously monitored, but they can be vague about how often those system logs must be reviewed. I know of some organizations that do it only once per month. I know of others that do it even less often. This is not to say there is no vigilance out there, but the overarching issue is that no matter how often syslogs are reviewed, it is done in the rear-view mirror. These are events that have already occurred. If there was a breach, or any kind of suspicious or malicious activity, the horse has already left the barn. The damage is done.

Of course continuous monitoring is important. But it doesn't go far enough. It is not truly preventive. The key is not continuous monitoring, but real-time monitoring, 24/7/365.

But many companies don't have the manpower, the expertise or the technology to achieve this. And for those that do, there are extra costs. So they ask: if I am IN compliance, what is my motivation to incur more costs and expend more resources? Anyone who has ignored the red warning light on a dashboard saying it's time for an oil change might be able to tell you. And so might the auditors dealing with the Long Beach Medical Center and New York Medicaid office breaches. You might be in compliance with the letter of the law, but not its spirit.

However, those who say they would need to spend more money and resources aren't looking to the cloud. They might not be aware that SIEM and log management developed, delivered and managed from the cloud can dramatically increase their security capabilities while significantly limiting costs and headcount. They might not be aware that security-as-a-service is that real-time monitoring enhancement in the "sky" that creates an alert the moment suspicious activity occurs and initiates preventive protocols to better safeguard private records. They might not be aware that it can stitch together separate and disparate data silos under a centralized management portal, making spot reviews, audit reporting and day-to-day maintenance much easier. Honestly, if you can accomplish better results on a smaller budget, it is your duty to at least perform due diligence and explore the option.

This is important in light of the root causes of the breaches I mentioned earlier. In both cases, the breaches appear to stem from internal sources using unregulated email.

How would real-time monitoring from the cloud have prevented this? Simple, if approached holistically. What is necessary is that credentialing and provisioning functions, such as those found in identity management (IDaaS) and enterprise access control (access management), be leveraged with log management and correlated through SIEM (intrusion detection, alerting). It may seem like trying to take a drink from a fire hose, but once all the data is leveraged and all the unique protocols, escalations, provisioning, rights and rules are centralized, real-time monitoring can assess true threats to the network (filtering out the false positives and white noise) and take appropriate action...BEFORE the damage is done.
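The correlation idea above can be sketched in a few lines: join each incoming log event against identity-management data (who is provisioned for what) and raise an alert the moment an action exceeds a user's rights, rather than finding it in next month's archive review. All record shapes, names and rules here are hypothetical illustrations, not any vendor's actual API.

```python
# Minimal sketch of SIEM-style real-time correlation of log events
# with identity-management entitlements. Everything here is a
# simplified, hypothetical illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class LogEvent:
    user: str
    action: str      # e.g. "email_external", "record_access"
    resource: str

# Provisioning data from the identity-management side (assumed format).
ENTITLEMENTS = {
    "alice": {"record_access"},
    "bob":   {"record_access", "email_external"},
}

def correlate(event: LogEvent) -> Optional[str]:
    """Return an alert the moment an event exceeds the user's
    provisioned rights; return None for normal activity."""
    allowed = ENTITLEMENTS.get(event.user, set())
    if event.action not in allowed:
        return f"ALERT: {event.user} attempted {event.action} on {event.resource}"
    return None

# A user emailing records without entitlement triggers an alert before
# the data leaves, instead of surfacing in a rear-view log review.
stream = [
    LogEvent("bob", "email_external", "newsletter"),
    LogEvent("alice", "email_external", "patient_records"),
]
for ev in stream:
    alert = correlate(ev)
    if alert:
        print(alert)  # fires for alice only
```

A production system would of course normalize events from many sources and apply far richer rules, but the design point is the same: the entitlement check happens per event, in the stream, not in a monthly batch.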

So my call to action is this: it is time to reassess what it means to continuously monitor. And that means finding ways to start monitoring in real time and applying preventive and PROACTIVE best practices.

Kevin Nikkhoo

www.cloudaccess.com

More Stories By Kevin Nikkhoo

With more than 32 years of experience in information technology, and an extensive and successful entrepreneurial background, Kevin Nikkhoo is the CEO of the dynamic security-as-a-service startup CloudAccess. CloudAccess is at the forefront of the latest evolution of IT asset protection: the cloud.

Kevin holds a Bachelor of Science in Computer Engineering from McGill University, Master of Computer Engineering at California State University, Los Angeles, and an MBA from the University of Southern California with emphasis in entrepreneurial studies.
