@CloudExpo: Blog Post

Risk versus Threat

It's more than just semantics

I was chatting with an IT professional about the benefits of cloud-based security, and he kept referring to a recent risk assessment he had performed. (If you haven't done one lately, you should.) But what got the gears in my head turning was how interchangeably he used the terms "risk" and "threat."

Now, on the surface they seem like the same component of security management. I tend to disagree. In the simplest terms, risk is the probability or frequency of harm occurring, while threat is the actual or attempted infliction of that harm. Tomato, tomahto? Splitting hairs? It's all about keeping your IT assets protected, right?

Well…yes…and no. Call it a pet peeve, but I believe risk and threat, although related, are two different beasts altogether. Risk includes variables. It surveys vulnerabilities and weighs challenges and opportunities to arrive at an outcome. And there is risk in every action you take; some of it is so low that it poses no challenge to your architecture. For example, let's say you notice traffic pinging a SQL server from a Bangladesh IP address. Let's further assume there is unique or proprietary data living on that server. Is this a threat? Could be; but too many variables must play out before you can properly assess its risk value. What is the source of the IP? Does this traffic follow normal, predictable patterns? Does it happen during off hours or in the course of business? Does it bypass any layered protocol, or does it use authenticated means? Now, if I told you this was a customer or a vendor, does that change whether it is a threat? It might, depending on your definitions, but the point is this: there is risk with every event or log entry, but whether that event is a threat is debatable.

And if you add "vulnerability" into the mix, it creates a third dimension when assessing risk. A vulnerability is a state of being: a weakness or gap in your security. A threat can exploit (intentionally or unintentionally) a vulnerability, and it is the risk assessment that identifies which vulnerabilities matter. Then, of course, you add likelihood: how realistic is it that this event will actually happen? Going back to our SQL example: if your SQL server has no external endpoints, the likelihood of an attack is considerably smaller than, say, malware finding its way to a web or email server. Think of it another way. Sharks are a threat. Sharks live in water. You own a pool. What is the risk of finding a shark in your pool? But you also have a toddler. How realistic is it to think that the toddler might wander past your security gate (when was the last time the locks were checked?) and fall into the pool? Maybe it's time to buy a pool cover. Just because you make an assessment and recognize certain gaps in protection doesn't mean you have taken steps to mitigate them. Now let's put it all together…

Threat x (Vulnerability + Likelihood) x Impact = Risk
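To make the formula concrete, here is a minimal sketch that scores the two scenarios from this post. The 1–5 ordinal scales, the specific scores, and the treatment of impact as a multiplier are all illustrative assumptions on my part, not an industry standard:

```python
# A minimal sketch of the risk formula above, using hypothetical
# 1-5 ordinal scores for each factor. The scales and the specific
# values below are illustrative assumptions, not a standard.

def risk_score(threat, vulnerability, likelihood, impact):
    """Risk = Threat x (Vulnerability + Likelihood) x Impact."""
    return threat * (vulnerability + likelihood) * impact

# The SQL-server example: a plausible threat, but low vulnerability
# (no external endpoints) and low likelihood, even though the impact
# is high (proprietary data lives on that box).
sql_server = risk_score(threat=3, vulnerability=1, likelihood=1, impact=5)

# The email example: constant phishing attempts (high threat), users
# as the weakest link (high vulnerability), and frequent clicks
# (high likelihood), with moderate impact.
email = risk_score(threat=4, vulnerability=4, likelihood=4, impact=3)

print(sql_server, email)  # the email scenario scores higher
```

The point of even a rough scoring exercise like this is prioritization: the email scenario outscores the SQL scenario despite its lower impact, which tells you where to spend your mitigation budget first.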

Let's put a face on it. Emails arrive at your enterprise workstations hundreds or thousands of times a day. As the user is typically the weakest link in the security chain, you have to depend on education and good sense to keep them from clicking on something potentially harmful. If they do click, what is the probability or likelihood that it will have some degree of negative effect on your IT environment? How quickly, and where, would it spread? And lastly, what is the damage done; what resources will be lost, and at what cost? Apply this to every security circumstance and there you have a risk assessment. And once you have a risk assessment, you can start planning to shore up vulnerabilities against threats.

But rather than debating semantics, let's look at this as best practice. How do you best put together a risk assessment and/or a threat assessment? A great many experts can spend endless consulting dollars answering that question, but the bottom line is that you weigh each risk to determine whether it poses enough of a hazard to warrant action. And these hazards can all be quantified in terms of dollars. If the pain threshold is high enough, then certain steps are taken to mitigate the threat.
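One widely used way to put those dollar figures on a hazard is annualized loss expectancy (ALE): the cost of a single occurrence of the event times how often it is expected per year. The sketch below shows the idea; the asset value, exposure factor, and occurrence rates are hypothetical numbers chosen for illustration:

```python
# A hedged sketch of annualized loss expectancy (ALE), one standard
# way to express a hazard in dollars. All figures below are
# hypothetical and chosen purely for illustration.

def single_loss_expectancy(asset_value, exposure_factor):
    # SLE: dollar loss from one occurrence of the event.
    return asset_value * exposure_factor

def annualized_loss_expectancy(sle, annual_rate_of_occurrence):
    # ALE: expected yearly loss = SLE x ARO.
    return sle * annual_rate_of_occurrence

# Example: a $200,000 database, 40% of its value lost per successful
# attack, with an attack expected to succeed once every two years.
sle = single_loss_expectancy(200_000, 0.40)   # dollars per event
ale = annualized_loss_expectancy(sle, 0.5)    # dollars per year
print(sle, ale)
```

Framed this way, the "pain threshold" becomes a straightforward comparison: if a control costs less per year than the reduction in ALE it buys you, it pays for itself; if not, you may rationally accept the risk.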

So how does this intersect with the cloud? It all goes back to resources. Do you have the technology, the budget, and/or the manpower to analyze every blip or to define and escalate every event? Security-as-a-service helps lift that burden by employing 24/7/365 monitoring and applying your risk assessments to defend your IT assets in real time. Once you define which events pose the greatest threats, you can prioritize response and take appropriate action without impacting your departmental staff.

In short, even the best risk assessment and mitigation measures leave a certain amount of residual risk, either because you can't mitigate every risk completely or because of the element of chance. But better understanding the difference between "threat" and "risk" can help you make decisions that will keep your systems safer and avoid unnecessary costs.

More Stories By Kevin Nikkhoo

With more than 32 years of experience in information technology, and an extensive and successful entrepreneurial background, Kevin Nikkhoo is the CEO of the dynamic security-as-a-service startup CloudAccess. CloudAccess is at the forefront of the latest evolution of IT asset protection: the cloud.

Kevin holds a Bachelor of Science in Computer Engineering from McGill University, a master's degree in Computer Engineering from California State University, Los Angeles, and an MBA from the University of Southern California with an emphasis in entrepreneurial studies.
