By Kevin Nikkhoo
December 3, 2012 08:15 AM EST
For all the right reasons, your company has been thinking about deploying SIEM: to create an alert system for when those with less-than-good intentions come knocking; to remediate potential network threats; to comply with federal, state, or industry regulations; and to identify the risks and vulnerabilities throughout the enterprise IT infrastructure and architecture. If you maintain even a modest organization (SMB to Fortune 1000) with any online identity, SIEM should be the cornerstone of your asset protection strategy.
First and foremost, SIEM (and to a certain extent log management) is about visibility: who is doing what, and when, on your network. It is as much about understanding the holistic landscape of your infrastructure as it is about protecting proprietary assets. Without it, you're coaching the Big Game with no idea who the opponent is; or, for that matter, whether you even have a starting left guard.
But fun metaphors aside, SIEM is a critical enterprise tool. And like any enterprise solution, it requires forethought, vigilance and, most importantly, a good game plan. Deployed properly, it can change your IT department from infrastructure-based to information-centric, and as such you get to make better decisions, faster.
With every technology there are best practices and pitfalls. In past articles I have spoken at length about the advantages of deploying and managing SIEM in the cloud, many of which concern the affordability, manageability, control and capability of the solution. For many, security from the cloud is still an emerging concept, but those who've already made the leap are reaping significant benefits. I want to move beyond the arguments for "going cloud" when deciding on security solutions, though. Today I want to focus on what happens next: how do you start collecting that ROI once a cloud-based security-as-a-service has been chosen?
The reason most enterprise deployments fail (on premise or cloud) can typically be traced to two causes: (1) lack of buy-in from the executive level or employee resistance to change, and, more often, (2) lack of vision or process. Too many companies jump in and apply a solution because they heard it was important, or were sold a Porsche when all they needed was a family SUV. Of course, one of the benefits of cloud-based security is the ability to "buy" the SUV and instantly scale up to that Porsche if and when the business need requires it (without touching CapEx budgets!). With that, here are eight best practices you should implement when moving forward with your cloud-based security initiative:
Best Practice #1: Identify your goals and match your scope to them. There are five questions you need to ask before moving forward with any deployment:
1. WHY do you need SIEM? (Compliance? User and/or partner expansion? BYOD? Breach detection?)
2. HOW will SIEM be deployed to properly address these issues? (What processes, functionality and capabilities are needed; what needs to be outsourced, replaced or improved?)
3. WHAT needs to be collected, analyzed and reported?
4. HOW BIG does the deployment need to scale to accurately and cost-effectively meet your specific business need?
5. WHERE is the information situated that should (or must) be monitored?
Best Practice #2: Incremental usage. The quickest route to success is taking baby steps: prove the concept, then expand the scope. For some, this might mean starting with log management and adding SIEM once you understand the requirements, commitment and volume. Because security-as-a-service is so flexible and can ramp up or down instantly, an easy entry point might be to start with only those elements that fulfill compliance. The project might seem overwhelming, but if you take it in bite-sized phases you will find the victories come easier and the ROI is justified. With a cloud security deployment it is easy to turn on the fire hose when only a garden hose is needed, but the beauty of the cloud is the ease and flexibility of scaling. Other examples of incremental usage would be applying SIEM against specific use-case scenarios, or migrating just a division, department or function (as opposed to the entire enterprise).
Best Practice #3: Determine what IS and ISN'T a threat to your network. Returning to the fire hose metaphor: when deploying a SIEM initiative, it is very easy to get lost in a sea of data, like trying to drink from that proverbial fire hose. The trick is to recognize what constitutes a true risk and eliminate false positives. This requires some internal analysis to create a series of rules that sift out the white noise and differentiate "normal" traffic from suspicious activity. For instance, if there is an attempted access to your partner portal from Russia, is that normal? Do you even have a partner in Moscow? But even a simple filter isn't quite enough. Risk is three-dimensional, and it can hide in plain sight. That's why you continue to filter based on time of day, IP address, server, attempts, network availability and myriad other forensic qualifiers before an alert is grave enough to require immediate attention.
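The layered filtering described above can be sketched as a small weighted rule engine. This is a hypothetical illustration, not any vendor's API: the event fields, rule weights, and alert threshold are all assumptions chosen to show the idea of scoring an event against several forensic qualifiers before it earns an alert.

```python
# Hypothetical normalized event; a real SIEM would derive these fields from log feeds.
event = {
    "source_ip": "198.51.100.7",
    "geo": "RU",              # country of origin
    "hour": 3,                # local hour of the attempted access
    "failed_attempts": 6,
    "target": "partner-portal",
}

# Each rule adds weight when it matches; weights and conditions are illustrative only.
RULES = [
    ("unexpected geo",    lambda e: e["geo"] not in {"US", "CA"},    3),
    ("off-hours access",  lambda e: e["hour"] < 6 or e["hour"] > 22, 2),
    ("repeated failures", lambda e: e["failed_attempts"] >= 5,       3),
    ("sensitive target",  lambda e: e["target"] == "partner-portal", 2),
]

ALERT_THRESHOLD = 7  # anything below this is treated as white noise

def score(event):
    """Sum the weights of every rule that matches the event."""
    return sum(weight for name, match, weight in RULES if match(event))

s = score(event)
print(s, "ALERT" if s >= ALERT_THRESHOLD else "ignore")  # prints: 10 ALERT
```

A single matching rule (one failed login, one foreign IP) stays below the threshold; it takes several qualifiers stacking up before the event is grave enough to surface, which is exactly the "sift out the white noise" behavior described above.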
Best Practice #4: Map response plans. Now that an incident has your attention, what do you do? Do you launch an account investigation, suspend the user, deactivate a password, deny access to the offending IP, or apply any number of remediations based on the severity of the event, the vulnerability, and the identity of the transgressor? This goes back to workflow and process: who is going to do what, to whom, and how? SIEM is a process-reliant technology. You simply can't flip a switch and say you've put up a magic force field around your network. Your response plan is your blueprint for closing the vulnerability gaps and ensuring compliance.
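A response plan like the one described above is, at its core, a mapping from incident severity to an ordered list of remediation steps. The following sketch is a hypothetical playbook; the severity tiers and actions are assumptions for illustration, not a prescribed workflow.

```python
# Hypothetical response playbook keyed by severity; actions are illustrative only.
PLAYBOOK = {
    "low":      ["log for weekly review"],
    "medium":   ["open account investigation", "notify security analyst"],
    "high":     ["suspend user", "force password reset", "notify security analyst"],
    "critical": ["suspend user", "block source IP at firewall", "page on-call engineer"],
}

def respond(severity):
    """Return the ordered remediation steps for an incident of this severity."""
    steps = PLAYBOOK.get(severity)
    if steps is None:
        raise ValueError(f"no playbook entry for severity {severity!r}")
    return steps

for step in respond("high"):
    print("->", step)
```

Writing the plan down in this form, before the first alert fires, is the point: when the incident arrives, the "who does what to whom" question has already been answered.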
Best Practice #5: Correlate data from multiple sources. The practice of situational awareness is what adds muscle to a SIEM initiative. As with #4, it isn't enough to plug in a solution and press "go." Situational awareness takes into account a multitude of different endpoints, servers, data streams, assets and inventories, events and flows from across the enterprise, and puts that information into context. Context is the most important part of risk assessment. For example, a shark is a threat; but if that shark is 10 miles away, it is not a direct or immediate one. That doesn't mean you're not vulnerable if the shark gets hungry. Having an engine that not only creates accurate perspective but analyzes, understands and acts upon behaviors is key, and to do that a centralized SIEM engine needs data from more than a single source or single server.
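Cross-source correlation can be sketched as grouping normalized events by user and flagging a suspicious combination that no single source would reveal on its own. The sources, event kinds, and ten-minute window below are hypothetical assumptions, a minimal sketch of the idea rather than a real correlation engine.

```python
from datetime import datetime, timedelta

# Hypothetical normalized events from three separate sources (VPN, file server, firewall).
events = [
    {"src": "vpn",      "user": "jdoe", "ts": datetime(2012, 12, 3, 2, 14), "kind": "login"},
    {"src": "fileserv", "user": "jdoe", "ts": datetime(2012, 12, 3, 2, 16), "kind": "bulk_read"},
    {"src": "firewall", "user": "jdoe", "ts": datetime(2012, 12, 3, 2, 18), "kind": "outbound_xfer"},
]

def correlate(events, window=timedelta(minutes=10)):
    """Flag any user whose login, bulk file read, and outbound transfer
    all occur within one window -- a pattern invisible to any single source."""
    by_user = {}
    for e in sorted(events, key=lambda e: e["ts"]):
        by_user.setdefault(e["user"], []).append(e)
    flagged = []
    for user, evs in by_user.items():
        kinds = {e["kind"] for e in evs}
        span = evs[-1]["ts"] - evs[0]["ts"]
        if {"login", "bulk_read", "outbound_xfer"} <= kinds and span <= window:
            flagged.append(user)
    return flagged

print(correlate(events))  # prints: ['jdoe']
```

Each event here is unremarkable in isolation; it is only the centralized view across all three feeds, with timestamps put side by side, that produces the context the paragraph above calls the most important part of risk assessment.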
Best Practice #6: Monitor in real time, 24/7/365. For many companies this is a challenge, but hackers don't sleep. And although a great deal of SIEM and log management is automated, it still requires the vigilance of round-the-clock monitoring. Trees might be falling in the forest, but if no one is there to see them, breaches occur and networks are compromised. I've witnessed plenty of IT departments that simply don't have the resources. Again, this is a considerable advantage that security-as-a-service provides, and it allows you to sleep just a little better at night. Knowing that this crucial element of your security is professionally addressed without additional staff or budget makes the cloud that much more valuable.
Best Practice #7: Remain calm! One thing we've noticed is that soon after a SIEM/log management deployment, there seem to be alerts and issues you never dreamed of. Things are bound to look worse before they get better, and it can seem overwhelming, like opening a Pandora's box of malware and botnets. For the most part, that's because you now know what you didn't know before. In some respects it is like looking at your hotel room comforter under a black light and a microscope. But once you realize what you're looking at, and that much of the remediation can be automated, you will soon see (with a bit of fine tuning and normalizing of correlation feeds) that anomalous events lessen and alert prioritization lets you make timely, intelligent decisions.
Best Practice #8: Evolve. Security is a moving target. You need to revisit your processes and workflows every few months to make sure you are up to date with compliance requirements, new users and access points, and expanded or redefined workflows. This is about more than recognizing the latest virus threats: new users access your network regularly, new layers of regulation are added, and new applications require monitoring. All in all, by feeding your cloud-based SIEM and log management solutions the new and necessary data, your enterprise will be more secure than it was yesterday.