Setting the Stage for Cybersecurity with Threat Intelligence

Effective cybersecurity requires an understanding of what assets need to be protected

Ransomware is the latest example of the increasingly sophisticated and damaging inventions of hackers. Individuals and organizations of all sizes are finding that their data has been locked down or encrypted until a ransom is paid. One program, CryptoLocker, infected more than 300,000 computers before the FBI and international law enforcement agencies disabled it. A few days later, CryptoWall showed up to take its place. Companies paid $1.3 billion last year in cyber insurance to help offset the costs of combating data attacks like these.

Other examples include highly customized malware, advanced persistent threats and large-scale Distributed Denial of Service (DDoS) attacks. Security professionals must remain ever vigilant against both known threats and new ones on the rise. However, with proper visibility into the extended network and robust intelligence, an attack can often be detected and stopped before it causes significant damage. By using the network as a source of intelligence, cyber defenders gain greater visibility into adversary actions and can shut them down quickly.

Since an attack can be broken down into stages, it is helpful to think of a response to an attack in stages as well: before, during and after. This is standard operating procedure for anyone in the security profession. Let's examine each stage:

Before: Cyber defenders are constantly on the lookout for areas of vulnerability. Historically, security has been all about defense. Today, teams are developing more intelligent methods of halting intruders. With total visibility into their environments - including, but not limited to, physical and virtual hosts, operating systems, applications, services, protocols, users, content and network behavior - defenders can take action before an attack has even begun.
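As a minimal sketch of what "before" visibility can look like in practice, the snippet below builds a baseline inventory of which hosts answer on which common service ports, so that later drift stands out. The host list, port list and timeout are illustrative assumptions, not any particular product's approach.

```python
# Minimal sketch: baseline which hosts answer on which service ports so that
# later changes (a new listener, a vanished host) stand out.
# Host and port lists are illustrative assumptions.
import socket
from concurrent.futures import ThreadPoolExecutor

HOSTS = ["10.0.0.5", "10.0.0.6"]        # assumed asset list, e.g., from a CMDB export
PORTS = [22, 80, 443, 3389]             # assumed services worth baselining

def probe(host, port, timeout=1.0):
    """Return True if a TCP listener answers on host:port within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def build_inventory(hosts, ports):
    """Map each host to the sorted list of ports that accepted a connection."""
    inventory = {}
    with ThreadPoolExecutor(max_workers=16) as pool:
        for host in hosts:
            open_flags = pool.map(probe, [host] * len(ports), ports)
            inventory[host] = sorted(p for p, is_open in zip(ports, open_flags) if is_open)
    return inventory

if __name__ == "__main__":
    baseline = build_inventory(HOSTS, PORTS)
    print(baseline)   # persist and diff against future runs to spot drift
```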

During the attack, impact can be minimized if security staff understands what is happening and how to stop it as quickly as possible. They need to be able to address threats continuously, not just at a single point in time. Capabilities such as content inspection, behavioral anomaly detection, and contextual awareness of users, devices, locations and applications are critical to understanding an attack as it occurs. Security teams need to discover where, what and how users are connected to applications and resources.
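One piece of that picture, behavioral anomaly detection, can be sketched very simply: compare each user's current activity to their own learned baseline and flag sharp deviations. The history, current values and 3-sigma threshold below are illustrative assumptions, not a production detector.

```python
# Minimal sketch of behavioral anomaly detection: flag users whose current
# outbound volume deviates sharply from their own historical baseline.
# The sample data and 3-sigma threshold are illustrative assumptions.
from statistics import mean, pstdev

history = {                                  # assumed hourly outbound bytes per user
    "alice":   [110_000, 120_000, 115_000, 118_000],
    "mallory": [100_000, 98_000, 102_000, 101_000],
}
current = {"alice": 121_000, "mallory": 4_800_000}   # assumed current observations

def zscore(value, samples):
    """How many standard deviations the value sits above the sample mean."""
    mu, sigma = mean(samples), pstdev(samples)
    if sigma == 0:
        return 0.0 if value == mu else float("inf")
    return (value - mu) / sigma

def flag_anomalies(current, history, threshold=3.0):
    """Return users whose current value exceeds their baseline by > threshold sigmas."""
    return [user for user, value in current.items()
            if user in history and zscore(value, history[user]) > threshold]

print(flag_anomalies(current, history))   # ['mallory'] with these sample numbers
```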

After the attack, cyber defenders must understand the nature of the attack and how to minimize any damage that may have occurred. Advanced forensics and assessment tools help security teams learn from attacks. Where did the attacker come from? How did they find a vulnerability in the network? Could anything have been done to prevent the breach? More important, retrospective security allows for an infrastructure that can continuously gather and analyze data to create security intelligence. Compromises that would have gone undetected for weeks or months can instead be identified, scoped, contained and remediated in real time or close to it.
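A small example of that retrospective step, under assumed log and indicator formats: when a fresh indicator of compromise arrives, sweep historical connection logs for hosts that contacted it before the indicator was known.

```python
# Minimal sketch of retrospective analysis: match newly learned indicators of
# compromise (IOCs) against historical connection logs to scope what happened
# before the indicator was published. File name and columns are assumptions.
import csv
from datetime import datetime

new_iocs = {"203.0.113.66", "evil-update.example.net"}   # assumed fresh indicators

def retro_matches(log_path, iocs):
    """Yield (timestamp, source host, destination) for rows hitting a known-bad IOC."""
    with open(log_path, newline="") as fh:
        for row in csv.DictReader(fh):        # assumed columns: ts, src, dst
            if row["dst"] in iocs:
                yield datetime.fromisoformat(row["ts"]), row["src"], row["dst"]

# Example use against an assumed log file:
# for ts, src, dst in retro_matches("connections.csv", new_iocs):
#     print(f"{ts}: {src} contacted {dst} before the IOC was known")
```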

The two most important aspects of a defensive strategy, then, are understanding and intelligence. Cybersecurity teams are constantly trying to learn more about who their enemies are, why they are attacking and how. This is where the extended network provides unexpected value: delivering a depth of intelligence that cannot be attained anywhere else in the computing environment. Much like in counterterrorism, intelligence is key to stopping attacks before they happen.

Cyber conflict, like some real-world warfare, is often asymmetric: relatively small adversaries with limited means can inflict disproportionate damage on much larger targets. In these unbalanced situations, intelligence is one of the most important assets for addressing threats. But intelligence alone is of little benefit without an approach that optimizes its organizational and operational use.

Security teams can correlate identity and context by using network analysis techniques - such as collecting records of IP traffic as it enters or exits an interface - and then layering threat intelligence and analytics capabilities on top of that data.

This allows security teams to combine what they learn from multiple sources of information to help identify and stop threats. Those sources include intelligence from the web, what they can observe happening in their own networks, and a growing amount of collaborative intelligence gleaned from exchanges with public and private entities, as sketched below.
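A minimal sketch of that combination, under assumed record shapes: enrich flow records with user identity and check destinations against an external threat-intelligence blocklist. All names and values here are illustrative, not a specific vendor's schema.

```python
# Minimal sketch of combining sources: flow telemetry + identity context +
# an external threat-intelligence feed. Record shapes and values are
# illustrative assumptions.
from dataclasses import dataclass

@dataclass
class Flow:
    src_ip: str
    dst_ip: str
    bytes_out: int

ip_to_user = {"10.0.0.5": "alice", "10.0.0.9": "mallory"}   # assumed identity source
threat_feed = {"198.51.100.23", "203.0.113.66"}             # assumed external intel feed
flows = [
    Flow("10.0.0.5", "93.184.216.34", 4_200),
    Flow("10.0.0.9", "203.0.113.66", 9_800_000),
]

def correlate(flows, ip_to_user, threat_feed):
    """Attach identity to each flow and flag flows headed to known-bad destinations."""
    alerts = []
    for f in flows:
        user = ip_to_user.get(f.src_ip, "unknown")
        if f.dst_ip in threat_feed:
            alerts.append((user, f.src_ip, f.dst_ip, f.bytes_out))
    return alerts

for user, src, dst, nbytes in correlate(flows, ip_to_user, threat_feed):
    print(f"ALERT: {user} ({src}) sent {nbytes} bytes to known-bad {dst}")
```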

CryptoWall will eventually be defeated, but other ransomware programs and as-yet-unknown attacks will rise to threaten critical data. Effective cybersecurity requires an understanding of what assets need to be protected and an alignment of organizational priorities and capabilities. Essentially, a framework of this type enables security staff to think like malicious actors and therefore do a better job of securing their environments. The security team's own threat intelligence practice, uniting commercial threat information with native analysis of user behavior, will detect, defend against and remediate security events more rapidly and effectively than once thought possible.

More Stories By Greg Akers

Greg Akers is the Senior Vice President of Advanced Security Initiatives and Chief Technology Officer within the Threat Response, Intelligence and Development (TRIAD) group at Cisco. With more than two decades of executive experience, Akers brings a wide range of technical and security knowledge to his current role.
