
Security Through Data Analytics

The best way to protect the infrastructure, the brand and the consumer

Given the mountains of data now floating around, it is perhaps inevitable that the very function of data analytics is seen as somehow intrusive. There's a constant glut of reports, columns and other stories bemoaning the lack of data privacy - and at times, they're entirely justified. Organizations have a solemn duty to protect their customers' data, and those that don't implement the proper safeguards deserve to be vilified.

But beneath the surface lurks another dimension of this discussion that is often overlooked. Ethical and effective data analytics enhances security. Ethical and effective data analytics protects not only the institutions that possess the data, but also the consumers that data reflects. Ethical and effective data analytics serves a good purpose.

Let's be clear about the parameters of this argument. Data doesn't exist in a vacuum - it's generated on an ongoing basis through multiple activities, created in numerous formats and comes in through a variety of channels. At any given time, it is being analyzed and used (and occasionally misused) to serve many different needs.

Of course, when done right, information services and analytics represent a key driver of most business decisions. Actionable intelligence based on real data doesn't just augment gut instinct; it leads to quantitative thinking that supports strategic initiatives, enables tactical outreach and boosts the bottom line. Perhaps most important, it enhances information security, protecting customer privacy and preventing operational and brand damage.

High-profile assaults on retailers like Target and Neiman Marcus, or clandestine downloads of classified information from the National Security Agency (NSA), make more news than inside-the-infrastructure DDoS attacks, but the latter can be even more insidious. There are over 2,000 DDoS attacks every day, and some 65 percent of all organizations see three or more attacks each year. The devastation is certainly felt on an operational level, but the financial impact is just as significant: a DDoS attack can cost up to $100,000 an hour.

DDoS mitigation can be an enormous challenge. Making an accurate distinction between normal, benign Internet traffic and malicious activity that could be the root cause of a potential DDoS attack is critical, but it's not easy. This is in part because DDoS attacks, especially when they serve as the front line of advanced targeted attacks, are remarkably sophisticated. They rely on stealth techniques that go unnoticed within the enterprise for long periods. They're highly customized, based specifically on each target's infrastructure and defenses, and can often defeat defense strategies that rely primarily on signature-based detection. Then of course there's the cloud. When attacks become super-sized, the defensive strategies in place must have the capacity to scrub far larger volumes of bad traffic.

This is why information services and analytics are so crucial: they sharpen situational awareness and shorten reaction times. When it comes to leveraging Big Data within the enterprise to help identify breach attempts, it's still early days. According to a January 2014 report from Gartner, eight percent of enterprises are using data and analytics to identify security flaws. But there's reason for optimism - the same report estimates that within the next two years, around 25 percent of enterprises will leverage Big Data for security purposes.

It is this same pattern-searching approach that the enterprise should take when it comes to DDoS mitigation. Proactive, continuous site monitoring - in particular with a centralized view of traffic patterns - enables organizations to identify anomalies and threats before they become real problems. For example, if a custom application is being exploited in a directed attack to steal customer data, the detection solution must be able to identify and highlight the appearance of a new kind of application traffic on the network.
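To make the pattern-searching idea concrete, here is a minimal sketch of how a monitoring tool might flag anomalous traffic against a rolling baseline. The function name, counts and threshold are hypothetical illustrations - a real DDoS detector would use far richer signals - but the basic statistical test is the same:

```python
# Illustrative sketch, not a production detector: flag a traffic
# observation that deviates sharply from a recent baseline.
from statistics import mean, stdev

def is_anomalous(baseline_counts, current_count, threshold=3.0):
    """Return True if current_count is more than `threshold` standard
    deviations away from the baseline mean (a simple z-score test)."""
    mu = mean(baseline_counts)
    sigma = stdev(baseline_counts)
    if sigma == 0:
        return current_count != mu
    return abs(current_count - mu) / sigma > threshold

# Hypothetical baseline: requests per minute for one application
# over the past eight monitoring intervals.
baseline = [120, 115, 130, 125, 118, 122, 127, 119]
print(is_anomalous(baseline, 124))  # normal fluctuation -> False
print(is_anomalous(baseline, 900))  # sudden surge -> True
```

The same test applied per application, per source region or per protocol is what turns a centralized view of traffic into an early-warning signal.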

This might be a new concept at the enterprise level, but banks have been doing something similar for years with fraud protection services. A bank monitors a customer's transaction activity, and when a purchase is made that does not fit the usual spending behavior, it is stopped and flagged with the customer. The same thing should - and will - happen at the enterprise level.
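The bank analogy can be sketched the same way. The profile fields, categories and multiplier below are hypothetical; real fraud systems use far richer behavioral models, but the rule "flag what doesn't fit the history" is the common core:

```python
# Minimal sketch of a bank-style rule: flag a transaction that falls
# outside the customer's historical spending pattern. All fields and
# thresholds here are hypothetical illustrations.
def flag_transaction(history, txn, multiplier=5.0):
    """Flag a transaction whose amount far exceeds the customer's
    historical maximum, or whose merchant category is unseen."""
    usual_max = max(t["amount"] for t in history)
    usual_categories = {t["category"] for t in history}
    if txn["amount"] > multiplier * usual_max:
        return True
    if txn["category"] not in usual_categories:
        return True
    return False

history = [
    {"amount": 40.0, "category": "groceries"},
    {"amount": 12.5, "category": "coffee"},
    {"amount": 95.0, "category": "fuel"},
]
print(flag_transaction(history, {"amount": 60.0, "category": "groceries"}))   # False
print(flag_transaction(history, {"amount": 2400.0, "category": "jewelry"}))   # True
```

Substitute "application" for "customer" and "traffic profile" for "spending behavior," and this is exactly the enterprise-level monitoring the article argues for.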

It's easy to see why information services and analytics are too often seen as a potential invasion of privacy. Data privacy is vital, and it should rightfully be a corporate priority. However, in the ongoing effort to secure data, the right kind of analytics can be the best weapon of all.

More Stories By Mark Bregman

Mark F. Bregman is Senior Vice President and Chief Technology Officer at Neustar. He joined the Neustar executive team in August 2011 and is responsible for Neustar’s product technology strategy and product development efforts.

Prior to joining Neustar, Dr. Bregman served as Executive Vice President and Chief Technology Officer of Symantec from 2006. His portfolio as CTO of Symantec Corporation included developing the company’s technology strategy and overseeing its investments in advanced research and development, security and technology services.

Prior to Symantec, Dr. Bregman served as Executive Vice President, Product Operations at Veritas Corporation, which merged with Symantec in 2005. Prior to Veritas, he was CEO of AirMedia, an early mobile content marketplace, and spent 16 years in a variety of roles at IBM. Dr. Bregman serves on the Board of the Bay Area Science & Innovation Consortium and the Anita Borg Institute, which focuses on increasing the impact of women on all aspects of technology.
