"Continuous" Does Not Equal Real Time

Continuous monitoring is enough for compliance, but it ISN'T enough to secure data

Every 4,000 miles or so I bring my car in to have the oil changed, the brakes checked and the tires rotated. Why? Because I know that if I leave it to chance, at some point down the road something much more devastating will happen to the car. Many of us follow this simple preventive best practice.

Then why do major corporations and modest enterprises alike wait until their security is breached to address growing concerns about data theft, private-information leakage or worse? Many of these companies spend hundreds of thousands of dollars on various security initiatives (especially those bound by a regulatory compliance agency), yet still succumb to breaches that cost, on average, $3.8 million per occurrence to address (a Ponemon Institute figure).

Two instances dropped into my inbox this week: a medical center in Long Beach, California, and a Medicaid office in New York State both experienced similar types of breaches that, in my opinion, were completely preventable.

It boils down to continuous monitoring...and that practice doesn't go far enough.

Continuous monitoring is the cornerstone of many compliance mandates; you find it in HIPAA, PCI, FISMA and others. Something (usually an archival solution gathering syslog files) must collect records of every event that touches the network perimeter. For a medium-size health care facility, that can be more than 500 events per second. For larger organizations, like the Long Beach Medical Center and the New York Office of the Medicaid Inspector General, the volume is easily 5X that amount. That's a lot of log lines to comb through to find a breach.
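
To put that volume in perspective, here is a back-of-the-envelope sketch in Python (the 500 and 2,500 events-per-second figures are the estimates above, not measurements from either organization):

```python
# Rough scale of the log-review problem described above.
SECONDS_PER_DAY = 24 * 60 * 60

for label, events_per_second in [("medium facility", 500), ("large facility", 2500)]:
    per_day = events_per_second * SECONDS_PER_DAY
    per_month = per_day * 30
    print(f"{label}: {per_day:,} events/day, ~{per_month:,} events/month")

# medium facility: 43,200,000 events/day, ~1,296,000,000 events/month
# large facility: 216,000,000 events/day, ~6,480,000,000 events/month
```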

Many hospitals and health care organizations are under great strain to maintain certain security and privacy protocols because of these compliance laws. They spend a great deal of time and money on security, but far too often we hear of a breach at one facility or another. There must be a disconnect somewhere.

I think the disconnect is in how the term continuous monitoring is defined and applied as a preventive best practice. Mandates state that systems must be continuously monitored, but they can be vague about how often those system logs must be reviewed. I know of some organizations that do it only once per month. I know others that don't do it even that often. This is not to say there is no vigilance out there, but the overarching issue is that no matter how often syslogs are reviewed, the review happens in the rear-view mirror. These are events that have already occurred. If there was a breach or any kind of suspicious or malicious activity, the horse has already left the barn. The damage is done.
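
A quick sketch makes the rear-view-mirror problem concrete. If logs are reviewed on a fixed cycle and a breach can land at any point in that cycle, it sits undetected for half a cycle on average (the review intervals below are illustrative assumptions, not figures from any mandate):

```python
# Average time a breach goes undetected under periodic log review,
# assuming incidents land uniformly within the review cycle.
review_intervals_days = {"monthly review": 30, "weekly review": 7, "daily review": 1}

for label, days in review_intervals_days.items():
    print(f"{label}: breach undetected for ~{days / 2:.1f} days on average")

# monthly review: breach undetected for ~15.0 days on average
# weekly review: breach undetected for ~3.5 days on average
# daily review: breach undetected for ~0.5 days on average
```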

Of course continuous monitoring is important. But it doesn’t go far enough. It is not truly preventive. The key is not continuous monitoring, but real-time monitoring--24/7/365.

But many companies don't have the manpower, the expertise or the technology footprint to achieve this. And for those that do, there are extra costs. So they ask: if I am IN compliance, what is my motivation to incur more costs and expend more resources? Anyone who has ignored the red warning light on a dashboard saying it's time for an oil change might be able to tell you. And so might the auditors dealing with the Long Beach Medical Center and New York Medicaid office breaches. You might be in compliance with the letter of the law, but not its spirit.

However, those who say they would need to spend more money and resources aren't looking to the cloud. They might not be aware that SIEM and log management developed, delivered and managed from the cloud can dramatically increase their security capabilities while significantly limiting costs and headcount. They might not be aware that security-as-a-service is that real-time monitoring enhancement in the "sky" that creates an alert the moment suspicious activity occurs and initiates preventive protocols to better safeguard private records. They might not be aware that it can stitch together separate and disparate data silos under a centralized management portal, making spot reviews, audit reporting and day-to-day maintenance much easier. Honestly, if you can accomplish better results on a smaller budget, it is your duty to at least perform due diligence and explore the option.
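
To make the contrast concrete, here is a minimal real-time alerting sketch. Everything in it (the event format, the is_suspicious rule, the raise_alert handler) is a hypothetical stand-in rather than any vendor's actual API; the point is simply that each event is evaluated the moment it arrives instead of in a later batch review:

```python
import json

def is_suspicious(event):
    """Hypothetical rule: flag bulk record exports by internal accounts."""
    return event.get("action") == "bulk_export" and event.get("source") == "internal"

def raise_alert(event):
    """Stand-in for paging the security team or triggering a lockout."""
    print(f"ALERT: {event['user']} performed {event['action']} at {event['time']}")

def monitor(stream):
    # Evaluate each event as it arrives, not in a monthly batch review.
    for line in stream:
        event = json.loads(line)
        if is_suspicious(event):
            raise_alert(event)

# Example: one suspicious event in a small stream of log lines.
sample = [
    '{"time": "09:14", "user": "jdoe", "source": "internal", "action": "login"}',
    '{"time": "09:17", "user": "jdoe", "source": "internal", "action": "bulk_export"}',
]
monitor(sample)
```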

This matters in light of the root causes of the breaches I mentioned earlier. In both cases, the breaches seem to stem from internal sources using unregulated email.

How would real-time monitoring from the cloud have prevented this? Simple, if approached holistically. What is necessary is that credentialing and provisioning functions, such as those found in identity management (IDaaS) and enterprise access control (access management), be leveraged with log management and correlated through SIEM (intrusion detection, alerting). It seems like trying to take a drink from a fire hose, but once all the data is leveraged and all the unique protocols, escalations, provisioning, rights and rules are centralized, real-time monitoring can strip out the false positives and white noise, assess the true threats to the network and take appropriate action...BEFORE the damage is done.
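
As a sketch of what that correlation might look like (the field names, roles and rule below are illustrative assumptions, not a description of any particular SIEM): identity management tells you who is allowed to do what, the mail gateway's logs tell you what actually happened, and a correlation rule fires only when the two disagree, which is also what filters out the false positives:

```python
# Hypothetical correlation of identity data with email logs: flag outbound
# mail with attachments from users whose role does not permit external email.
roles = {  # from identity management (IDaaS)
    "jdoe": {"external_email_allowed": False},
    "asmith": {"external_email_allowed": True},
}

email_events = [  # from the mail gateway's logs
    {"user": "asmith", "recipient": "partner@example.com", "attachments": 1},
    {"user": "jdoe", "recipient": "unknown@webmail.example", "attachments": 3},
]

for event in email_events:
    allowed = roles.get(event["user"], {}).get("external_email_allowed", False)
    external = not event["recipient"].endswith("@ourhospital.example")
    if external and event["attachments"] > 0 and not allowed:
        print(f"ALERT: {event['user']} emailed {event['recipient']} "
              f"with {event['attachments']} attachment(s) against policy")

# Only jdoe triggers an alert; asmith's permitted external mail does not.
```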

So my call to action is this: it is time to reassess what it means to continuously monitor. And that means finding ways to start monitoring in real time and applying preventive and PROACTIVE best practices.

Kevin Nikkhoo

www.cloudaccess.com

More Stories By Kevin Nikkhoo

With more than 32 years of experience in information technology and an extensive and successful entrepreneurial background, Kevin Nikkhoo is the CEO of the dynamic security-as-a-service startup CloudAccess. CloudAccess is at the forefront of the latest evolution of IT asset protection: the cloud.

Kevin holds a Bachelor of Science in Computer Engineering from McGill University, a master's degree in Computer Engineering from California State University, Los Angeles, and an MBA from the University of Southern California with an emphasis in entrepreneurial studies.
