Sailing the Seven Cs of Security Monitoring

Establishing alliterative best practices for watching over your IT environment: from continuous to cloud!

What is it your mom used to say? “A watched pot never boils.” That may be true, but a watched pot also never spills; it never lets your younger sister stick her hand in the hot water; it keeps Uncle Jack from tasting before dinner is ready; and if something unforeseen happens, there is time to mitigate the problem.

One of the established best practices in InfoSec is monitoring. People, products and companies get paid a great deal of money and expend a great deal of resources to watch pots. Monitoring is simply the central component of any security initiative. If you don’t watch it, it still happens (trees in the forest still fall and still make sounds); you’re simply not aware of it, and so unable to prevent the issue, control the damage, or keep assets from spiraling beyond your control. Monitoring is the baseline for accountability and responsibility. It provides the information necessary to make risk-based decisions about the assets supporting core missions and business functions.

But as with all best practices, there are variables. How much should you monitor? Which priorities matter? Where are your greatest vulnerabilities? To that end, I have boiled monitoring down to seven best practices… the seven Cs of security monitoring:

  1. Consistency
  2. Continuous
  3. Correlation
  4. Context
  5. Compliance
  6. Centralization
  7. Cloud

Consistency: Every company is different. Each has its own threshold of organizational risk. A credit union or health clinic is much more likely to need a higher bar than an air-and-heating contractor. That doesn’t mean the smaller company can ignore risk; it simply means the levels and layers that require monitoring are (typically) less complex. The key to consistency is process. And to define a process you must first define a strategy, agree on the measures and metrics, and follow through with a monitoring program. Start by understanding how your users interact with the network and the various risks that poses. Once you know what needs to be monitored and the baselines (risk tolerance) that determine what constitutes an alert or other suspicious activity, you can build a program, standardize that configuration, and analyze the results to make adjustments. From there it is wash, rinse and repeat.
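To make that concrete, here is a minimal sketch of what a standardized baseline might look like in practice: agreed thresholds captured in one configuration, with every observation checked against the same numbers. The metric names and threshold values are purely illustrative assumptions, not taken from any particular product or from the article itself.

```python
# Hypothetical sketch: encoding monitoring baselines as a standardized,
# repeatable configuration rather than ad-hoc judgment calls.
# Metric names and thresholds are illustrative only.

BASELINES = {
    "failed_logins_per_hour": 10,     # alert above this per-user rate
    "after_hours_admin_logins": 0,    # any admin login outside 07:00-19:00
    "new_devices_per_day": 3,         # unusual number of new endpoints
}

def evaluate(metric, observed):
    """Return True if the observed value breaches the agreed baseline."""
    threshold = BASELINES.get(metric)
    if threshold is None:
        return False  # unmonitored metric: extend BASELINES rather than guess
    return observed > threshold

# Wash, rinse, repeat: review observed values against the same baselines
# every cycle, then adjust thresholds as the organization's risk tolerance changes.
if evaluate("failed_logins_per_hour", observed=27):
    print("ALERT: failed-login rate exceeds baseline")
```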

Recently, the Department of Homeland Security’s director of federal network resilience noted that as you move to standardized configurations, networks are not only more secure but also less expensive to operate. “There is almost a trifecta of controlling cost, increasing service and improving security,” he said.

Continuous: Hackers don’t sleep, so why should your security? It is well understood that continuous monitoring is the best way to prevent breaches, discover anomalies and control assets. However, there are differences of opinion as to what “continuous” means. Are you to hire a dedicated analyst to watch every ping, blip and log? Guards armed with wiener-dog lasers in front of your server room? Of course not. In this case, our working definition of “continuous” is unique to every organization and needs to be commensurate with its risk and resources. NIST (the National Institute of Standards and Technology) recommends an ongoing “frequency sufficient to support risk-based security decisions as needed to adequately protect organization information.” Despite the vagueness of that statement, the goal must nonetheless be 24/7/365 coverage. Achieving that degree of continuity requires a series of automated processes and controls combined with the expertise to analyze vulnerabilities and initiate action. Yet the linchpin of an effective round-the-clock strategy is that it happens in real time. (See the “C” for cloud below for how this approach can be affordable, efficient and manageable.) If there are issues, as you define them, you get the alerts immediately, not a week later as you comb through log transcripts. Continuous monitoring is about proactivity as much as it is about response, in that the immediacy of action it allows mitigates any potential threat.
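As a rough illustration of the real-time point (and not a description of any specific product), the sketch below assumes a plain-text authentication log and shows the difference between reviewing transcripts later and watching the stream as entries arrive. The file path, the suspicious markers and the alert function are all assumptions.

```python
# Hypothetical sketch: continuous, automated watching of an event stream,
# raising alerts as entries arrive instead of during a weekly log review.

import time
from pathlib import Path

LOG_FILE = Path("/var/log/auth.log")   # assumed source of auth events
SUSPICIOUS = ("Failed password", "Invalid user")

def alert(line):
    # In practice this would page an analyst or open a ticket;
    # printing stands in for that here.
    print(f"ALERT ({time.strftime('%H:%M:%S')}): {line.strip()}")

def follow(path):
    """Yield new lines as they are appended, like `tail -f`."""
    with path.open() as handle:
        handle.seek(0, 2)              # start at the end of the file
        while True:
            line = handle.readline()
            if not line:
                time.sleep(0.5)        # poll interval: tune to taste
                continue
            yield line

for entry in follow(LOG_FILE):
    if any(marker in entry for marker in SUSPICIOUS):
        alert(entry)
```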

Continuous monitoring has been identified by NIST and the SANS 20 Critical Security Controls as key to reducing risk in IT environments. Now, I am not saying continuous monitoring is a silver bullet, but it certainly lessens the possibility of attack, carelessness and operational failure.

Correlation: In the modern enterprise, there are simply too many silos of information, too many endpoints for access, too many variables of risk, and not enough visibility or resources to properly protect all of an enterprise’s assets. Monitoring in its simplest form looks at one of those silos, one application at a time; it examines possible events, or log-ins, or credentials. To enhance effectiveness, there needs to be tight collaboration among all the resources. That expands visibility and creates a more accurate view of all online and network assets. Correlation needs to tie together the cooperative capabilities of tools such as SIEM, log management, identity and access management, malware scanning, and so on. If security is about maintaining visibility, correlation is its magnifying glass. Or, to mix my metaphors, it is the lens on a camera that brings a blurry scene into sharp focus. Good correlation, for example, removes the specter of false positives. Consider how the entitlement configuration from an access management feature set can feed the correlation engine of a SIEM to help distinguish authorized access from suspicious activity. The resulting alerts happen in real time and provide the directed response necessary to remediate any issues. All of that detail is also recorded historically, through the log management capabilities, for reporting and compliance purposes.
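A hedged sketch of that SIEM-plus-access-management idea: join the entitlements an identity system says a user holds with the raw access events a log pipeline sees, and only alert on the mismatches. The data shapes, user names and entitlement sets are invented for illustration.

```python
# Hypothetical sketch: a correlation rule that joins identity entitlements
# (from an access-management system) with raw access events (from a SIEM or
# log pipeline) so that authorized activity does not raise a false positive.

ENTITLEMENTS = {
    "mike":  {"crm", "email"},
    "priya": {"crm", "email", "finance-db"},
}

def correlate(event):
    """Return an alert string for suspicious events, or None if authorized."""
    user, resource = event["user"], event["resource"]
    allowed = ENTITLEMENTS.get(user)
    if allowed is None:
        return f"Unknown identity '{user}' touched {resource}"
    if resource in allowed:
        return None                     # authorized: just log it, no alert
    return f"'{user}' accessed {resource} outside granted entitlements"

events = [
    {"user": "mike", "resource": "crm"},          # fine
    {"user": "mike", "resource": "finance-db"},   # suspicious
]
for finding in filter(None, map(correlate, events)):
    print("ALERT:", finding)
```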

Correlation is rooted in consistency. You first need to know the landscape in order to create the rules, and the rules of correlation create the baseline on which to manage a consistent initiative. This also goes a long way toward underscoring the next two Cs: context and compliance.

Context: Automation can make the process of continuous monitoring more cost-effective, consistent and efficient. But continuous monitoring without intelligence simply produces more data. For example, the network processes an application login request with an approved user name and password. That in itself is not remarkable. However, the IP address doesn’t match the user’s usual location or the device’s usual behavior; this one is coming from Zagreb. Is Mike from sales in Zagreb? The system says no, because only four short hours ago he was logging off from an office in Denver. That situational awareness raises a red flag and escalates an alert. And because this is done in real time, IT catches the activity and is able to block access.
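The Zagreb-versus-Denver example is essentially an impossible-travel check, and a minimal version can be sketched as follows. The coordinates, the four-hour gap and the 900 km/h plausibility threshold are illustrative assumptions; a real deployment would draw locations from IP geolocation and tune the threshold to its own risk tolerance.

```python
# Hypothetical sketch of the "Zagreb vs. Denver" check: flag a login whose
# implied travel speed between the previous location and the current one
# is physically implausible.

from math import radians, sin, cos, asin, sqrt

MAX_PLAUSIBLE_KMH = 900   # roughly airliner speed; tune to risk tolerance

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def impossible_travel(prev, curr):
    """prev/curr are (lat, lon, unix_timestamp) for consecutive logins."""
    distance = haversine_km(prev[0], prev[1], curr[0], curr[1])
    hours = max((curr[2] - prev[2]) / 3600, 1e-6)
    return distance / hours > MAX_PLAUSIBLE_KMH

denver = (39.74, -104.99, 0)            # Mike logs off in Denver...
zagreb = (45.81, 15.98, 4 * 3600)       # ...and "logs in" from Zagreb 4 hours later

if impossible_travel(denver, zagreb):
    print("ALERT: impossible travel detected; block access and escalate")
```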

Compliance: The common thread in the alphabet soup that is compliance (HIPAA, PCI, FISMA, FFIEC, CIP, SOX, etc.) is the need to know who is logging in, which assets they are accessing, and that only appropriately credentialed users can do those things. When you are dealing with sensitive information such as credit card numbers, social security numbers and patient histories, a strong and continuous monitoring initiative is not just a way to avoid fines; it is the basis of good and trustworthy operation.

So much has been written about compliance and network security that all I will add is this: understand the responsibility you have toward customers, partners, employees and users; accurately calculate the risk in maintaining their information; and vigilantly maintain the monitoring process that makes you a good steward of their trust. And of course, a solid monitoring strategy will provide industry regulators with the reporting and evidence of your compliance.

Centralization: With all the moving parts, silos, device types and elements to monitor, a security infrastructure without a means to centralize becomes disjointed, uncoordinated and considerably harder to manage. The continual increase in daily network threats and attacks makes it challenging not only to maintain a complex heterogeneous environment but also to ensure compliance by deploying network-wide security policies. The ability to forensically analyze the infrastructure under a single pane of glass is not just a convenience; it seals up the vulnerability cracks.
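As a small sketch of the single-pane-of-glass idea, the code below merges events from three invented silos (firewall, IAM, application logs) into one timestamp-ordered stream that an analyst could review in a single place. The sources and event tuples are assumptions made for illustration.

```python
# Hypothetical sketch: funnelling events from separate silos into one
# central, timestamp-ordered stream so analysts work from a single view
# rather than three consoles.

import heapq

firewall_events = [(1001, "firewall", "port scan from 203.0.113.7")]
iam_events      = [(1003, "iam",      "privilege escalation for 'svc-backup'")]
app_events      = [(1002, "app",      "5xx error spike on /checkout")]

def centralize(*sources):
    """Merge already-sorted (timestamp, source, message) streams into one timeline."""
    return heapq.merge(*sources)

for ts, source, message in centralize(firewall_events, iam_events, app_events):
    print(f"[{ts}] {source:8} {message}")
```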

Cloud: Best-practice monitoring requires more than just a pair of eyes. The strategy includes investment in a variety of solutions, tools, servers, analysts and more. For many companies, this is not tenable in terms of human resources, budgets and core competencies. This is why continuous monitoring from the cloud (a.k.a. security-as-a-service) is the great equalizer. Through cloud-based security, a small health clinic in Bozeman, Montana can wrangle the same enterprise capabilities as New York Presbyterian. The only difference is the scale necessary to achieve a strong deployment and a sustainable initiative.

Addressing the issue from the cloud solves several pressing problems while providing the heft necessary to create the visibility to govern credentialing policies, remediate threats and satisfy compliance requirements across an enterprise of any size. What’s more, all the solutions noted above, from SIEM to access management, are available from the cloud. And there are a few providers that can harness all of those solutions collectively and centralize them under that single pane of glass.

As you embark to set sail on the 7 Cs, leave a note for your mother to watch the pot.

Kevin Nikkhoo
Captain of Continuous Monitoring
CloudAccess

More Stories By Kevin Nikkhoo

With more than 32 years of experience in information technology, and an extensive and successful entrepreneurial background, Kevin Nikkhoo is the CEO of the dynamic security-as-a-service startup CloudAccess. CloudAccess is at the forefront of the latest evolution of IT asset protection: the cloud.

Kevin holds a Bachelor of Science in Computer Engineering from McGill University, a master’s degree in Computer Engineering from California State University, Los Angeles, and an MBA from the University of Southern California with an emphasis in entrepreneurial studies.
