Brass Tacks: Answering the Cloud Security Questions That Matter

Who is logging in? What is accessed? When was it changed? And more!

Enterprise security can be a labyrinthine, complex beast with many moving parts and dozens upon dozens of requirements, needs, implications, options and alternatives.

But when we get down to the nitty gritty (the brass tacks, if you will), cloud security can be simplified into six simple questions:

WHO is logging in?

WHAT are they accessing/viewing?

WHERE is the device from which that person logs in?

WHEN was any asset changed/modified/moved?

HOW are they authorized/credentialed?

WHAT is the impact of the event?

Determining the answers to those questions might require a bit of coordination, but in terms of initiative and priority, it is those answers that must drive any enterprise security initiative.

The concept of enterprise security is simple: let in those you want to see and access data, and keep everyone else out. The addendum, of course, is that those on the “inside” cannot, and should not, distribute information outside the “Circle of Trust.” Note that I said the concept is simple. The application of those concepts is much more complex. It requires the coordination and integration of several features from several different solutions. If you are a CFO, I can see you rolling your eyes: how much is this going to cost? (The answer: not as much as you think, if you deploy cloud-based solutions.) If you are a CTO, I can hear that you’ve got it covered. If so, great, but there are thousands upon thousands of companies who don’t, or who have only a portion of the equation adequately addressed. And the biggest issue with the latter is that too many professionals mistakenly think their portion is doing the whole job.

In journalism class, I learned that the 5W’s and an H (who, what, when, where, why and how) are what every news story needs. Extrapolated, it’s what every enterprise security initiative needs as well.  If you can answer these questions, then you’re definitely on the right path to securing your assets as well as the privacy of your users. Yet, the key to answering these questions is having certain solutions and policies in place. Beyond that, you need those access management solutions and policies to “talk” to each other. It’s no longer good enough to have user provisioning if it doesn’t communicate with intrusion detection (SIEM/Log Management) or single sign on (access management). That brings us to our first question:

Who is logging in? You can implement all the industry standards and follow all the regulatory frameworks and best practices, but if you're sloppy from the start with verifying identities, maintaining credentials or parsing the data, then your security is no better than blind man’s bluff. Let’s say Ms. Jones leaves your employ. If you don’t have user provisioning, it may take days or weeks (if ever) to remove her permissions from network access protocols. Without intrusion detection, you won’t know if she's trying to access her account after her employment relationship ends. But assume she's an honest person and that your company de-provisioned her accounts. That still doesn’t mean someone isn’t trying to breach using her credentials. How will you know? If an alert policy is set to notify someone when an access attempt is made against a dormant or retired account (or after three failed log-in attempts), you can immediately trace it to the source IP and remediate the issue. The thing is, if you think whoever is attempting a breach is limited to just Ms. Jones’ former account, you’re sadly mistaken. By integrating single sign on, identity management and SIEM, you know exactly who is logging in (with a record for compliance), and those without the proper credentials are left on the outside looking in.
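The alert policy described above can be sketched in a few lines. This is a minimal, hypothetical stand-in for what a SIEM rule engine does; the account names, threshold and event shape are illustrative assumptions, not any particular product's API:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical stand-ins for a SIEM's account registry and alert sink.
DEPROVISIONED = {"mjones"}      # accounts already removed by user provisioning
FAILED_LOGIN_THRESHOLD = 3      # alert after three failed attempts

failed_counts = defaultdict(int)
alerts = []

def process_login_event(user, success, source_ip, when):
    """Raise an alert for dormant-account activity or repeated failures."""
    if user in DEPROVISIONED:
        alerts.append((when, user, source_ip, "attempt against deprovisioned account"))
        return
    if not success:
        failed_counts[user] += 1
        if failed_counts[user] >= FAILED_LOGIN_THRESHOLD:
            alerts.append((when, user, source_ip, "repeated failed logins"))
    else:
        failed_counts[user] = 0  # a successful login resets the failure counter

# Someone tries Ms. Jones' retired credentials: an alert fires immediately,
# with the source IP attached for remediation.
process_login_event("mjones", False, "203.0.113.7", datetime(2017, 7, 1, 3, 0))
print(alerts[0][3])  # -> attempt against deprovisioned account
```

In a real deployment the deprovisioned set would be fed by the identity management system and the alerts routed to on-call staff; the point is that the two silos have to share data for the rule to exist at all.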

What are they accessing? Just because a user provides the proper user name and password combination doesn't mean they should get the keys to the entire kingdom. Think of what would happen if all the currency plates or gold reserves were held in the same building: anyone with a key and a wheelbarrow could be wealthy beyond the dreams of avarice. The same needs to be considered for your IT assets, your proprietary sensitive data and personal information. What files, data and applications are necessary for a person to do their job or place an order securely? Authenticated access opens the door, but it is role-based provisioning that limits what they can see, based on the needs of the person and the permission policies ascribed to them. Does shipping & receiving need human resources applications to send Product X to a customer? In this scenario, integrating single sign on with identity management creates the necessary automation, but it also produces compliance information that is automatically compiled for reporting. That’s something I call Enterprise Access Control. The next step is ensuring the data is properly encrypted, but that‘s a blog for another day.
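Role-based provisioning boils down to a mapping from roles to permissions, consulted on every access. The sketch below uses a hard-coded dictionary and made-up permission names purely for illustration; in practice the mapping lives in the identity management system:

```python
# Hypothetical role-to-permission mapping; real deployments pull this from
# an identity management directory, not a hard-coded dict.
ROLE_PERMISSIONS = {
    "shipping": {"orders.read", "shipments.create"},
    "hr":       {"employees.read", "payroll.read"},
}

def can_access(user_roles, permission):
    """Grant access only if one of the user's roles carries the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in user_roles)

# Shipping & receiving can ship Product X, but has no path into HR applications.
assert can_access(["shipping"], "shipments.create")
assert not can_access(["shipping"], "payroll.read")
```

Logging each `can_access` decision is what yields the automatically compiled compliance record the paragraph above mentions.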

Where is the device? You want to know that activity is coming from validated IP addresses. For example, Carl is your salesman and spent the night in a hotel in Wichita. He logged in and uploaded some expense reports. No problem: he had the right credentials, and he accessed portions of the ERP that match his role permissions. Three hours later, Carl’s account is accessed again with the right user name and password, but this time the SIEM system picks up that the IP address is coming from Romania. Unless Carl has some superhuman power or used a Star Trek transporter, it isn’t him. With the significant rise in BYOD (tablets, iPhones, laptops, etc.), you need to make sure you know where the access is coming from. Even without profiling, you know that certain areas of the globe are more likely to be engaged in hacking.
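The Wichita-to-Romania scenario is what SIEM vendors often call an "impossible travel" check: geolocate successive logins and flag any pair that would require faster-than-airliner travel. A minimal sketch, with an assumed 900 km/h cutoff and hand-picked coordinates:

```python
import math
from datetime import datetime

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

MAX_KMH = 900  # roughly airliner speed; anything faster is "impossible travel"

def is_impossible_travel(prev, curr):
    """prev/curr are (datetime, lat, lon) tuples for successive logins."""
    hours = (curr[0] - prev[0]).total_seconds() / 3600
    if hours <= 0:
        return True  # simultaneous logins from two places are suspicious too
    return distance_km(prev[1], prev[2], curr[1], curr[2]) / hours > MAX_KMH

# Carl in Wichita at 9pm, then "Carl" in Bucharest three hours later.
wichita = (datetime(2017, 7, 1, 21, 0), 37.69, -97.34)
romania = (datetime(2017, 7, 2, 0, 0), 44.43, 26.10)
print(is_impossible_travel(wichita, romania))  # -> True
```

The IP-to-coordinates step (GeoIP lookup) is assumed to have happened upstream; the check itself only needs the timestamps and locations.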

When was the asset changed? Much like “where,” when certain applications or data are accessed provides clues as to whether an event is suspicious or benign. By applying adaptive risk policies (predictive behavior protocols), you can plot when certain IP addresses log on. Does it raise any red flags that attempts on certain accounts or applications are happening at 3 in the morning? In and of itself, probably not. But if you apply context from various data deposits, the picture becomes much clearer. If you are in Chicago, 3am is 9am in London. If you have customers or partners in London, there’s probably no cause for alarm. However, if the log-on is to an application that a partner would have no need of, then the time of the attempt is very relevant. To correlate the appropriate context, your security functions need to be centralized and collaborating 24/7/365.
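The Chicago/London example comes down to evaluating the event time in the counterparty's timezone rather than your own. A minimal sketch, assuming a hypothetical per-partner business-hours policy (the partner name, offset and hours are made up for illustration):

```python
from datetime import datetime, timedelta, timezone

# Hypothetical per-partner policy; the partner's local timezone is expressed
# here as a fixed UTC offset (London in summer, BST = UTC+1) for simplicity.
PARTNER_TZ = {"london_partner": timezone(timedelta(hours=1))}
BUSINESS_HOURS = range(8, 19)  # 08:00-18:59 local time

def is_off_hours(partner, event_utc):
    """Flag an access that falls outside the partner's local business hours."""
    local = event_utc.astimezone(PARTNER_TZ[partner])
    return local.hour not in BUSINESS_HOURS

# 3am in Chicago (CDT, UTC-5) is 8am UTC, i.e. 9am in London: benign.
chicago_3am = datetime(2017, 7, 3, 8, 0, tzinfo=timezone.utc)
print(is_off_hours("london_partner", chicago_3am))  # -> False
```

An off-hours hit alone is just one signal; as the paragraph above says, it only becomes actionable once correlated with which application was touched and by whom.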

How are they credentialed? I’ve addressed role-based user provisioning and application access restriction, but the last part of the policy is ensuring that multi-factor authentication is applied. In short, the best practice is to create security questions whose answers cannot be easily found in any other digital forum (like birthdays, hire dates or names of family). The other half is to insist on strong passwords that include letters, numbers and symbols. Recognizing that a single password has a certain likelihood of being cracked, authenticating a user on two of these factors makes it one step harder for someone to exploit your system. Multi-factor authentication will make it substantially more difficult for an attacker to gain access, but it’s certainly not going to stop a determined attacker. This is again why I insist on a layered and integrated security infrastructure.
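Both halves of this policy are mechanical enough to sketch: a password-composition check (the 12-character minimum here is my assumption, not a stated policy), plus a second factor. For the second factor I show a standard time-based one-time password per RFC 6238, which is one common way, not necessarily the author's, of adding a factor:

```python
import base64
import hashlib
import hmac
import re
import struct
import time

def is_strong_password(pw, min_len=12):
    """Letters, numbers and symbols, per the policy described above."""
    return bool(len(pw) >= min_len
                and re.search(r"[A-Za-z]", pw)
                and re.search(r"[0-9]", pw)
                and re.search(r"[^A-Za-z0-9]", pw))

def totp(secret_b32, for_time=None, step=30, digits=6):
    """RFC 6238 time-based one-time password using HMAC-SHA1."""
    key = base64.b32decode(secret_b32)
    counter = int((for_time if for_time is not None else time.time()) // step)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(is_strong_password("Tr0ub4dor&3!"))  # -> True
print(is_strong_password("password123"))   # -> False
```

A login would then require the password check *and* a matching `totp()` code from the user's device: two independent factors, which is exactly the "one step harder" the paragraph describes.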

What is the impact? Not all events require your IT SWAT team. Vulnerability scans using situational context allow you to separate the white noise and the everyday routine from suspicious events. Again, it is the flow of data from various silos and applications that makes security more effective and allows you to determine the threat level of any particular cluster of activity.
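One common way to operationalize "threat level" is to combine the contextual signals from the preceding sections into a single score and triage on it. The weights and thresholds below are invented for illustration; a real SIEM would tune them:

```python
# Hypothetical weights for contextual risk signals; this just illustrates
# combining signals from different silos into one triage decision.
SIGNAL_WEIGHTS = {
    "dormant_account": 50,   # activity on a deprovisioned account
    "unusual_geo": 30,       # impossible travel / unexpected region
    "failed_logins": 20,     # repeated failed attempts
    "off_hours": 10,         # outside the counterparty's business hours
}

# Score floors mapped to threat levels, checked from highest down.
THRESHOLDS = [(70, "critical"), (40, "review"), (0, "white noise")]

def triage(signals):
    """Map a set of observed signals to a threat level."""
    score = sum(SIGNAL_WEIGHTS.get(s, 0) for s in signals)
    for floor, level in THRESHOLDS:
        if score >= floor:
            return level

print(triage({"off_hours"}))                        # -> white noise
print(triage({"unusual_geo", "dormant_account"}))   # -> critical
```

A single off-hours login stays in the white noise; the same login on a dormant account from an unexpected country crosses the line that actually wakes up the IT SWAT team.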

But the reality is that answering all these questions takes investment, and beyond dollars, it takes investment of time, resources, expertise and infrastructure. The overarching problem is that not every company has that luxury, or an excess of expenditures to apply against something that is looked upon as a cost center. If for no other reason than that, companies must look toward the cloud as a provider of security solutions. Cloud-based security can be seen as the great equalizer, whether it is adding new capabilities or leveraging what you already have in house. It centralizes all the tools and all the expertise as a holistic security-as-a-service that broadens the reach and scope of enterprise monitoring, strengthens access authentication, and satisfies regulatory compliance. And because of the cost and resource savings, it shifts an initiative’s dynamic from limited scope to performance.

To answer all the questions posed above, the most important element a security initiative must have is visibility. Without coordinating/correlating the data acquired from independent security initiatives, you are wasting resources, potentially duplicating efforts and not necessarily seeing the whole picture.

If the application of security were simple, then everybody would recognize that Superman was Clark Kent. Are you trying to tell me that a simple pair of glasses can fool everybody? So it really is all about context. You don’t expect to see Superman as Clark Kent, so you don’t. However, if you put all the information in context, centrally, then you would see the truth. That is why it remains ever so important to keep asking the questions of your security initiative: who, what, where, when and how?

Kevin Nikkhoo

CloudAccess


With more than 32 years of experience in information technology, and an extensive and successful entrepreneurial background, Kevin Nikkhoo is the CEO of the dynamic security-as-a-service startup CloudAccess. CloudAccess is at the forefront of the latest evolution of IT asset protection: the cloud.

Kevin holds a Bachelor of Science in Computer Engineering from McGill University, a Master of Computer Engineering from California State University, Los Angeles, and an MBA from the University of Southern California with an emphasis in entrepreneurial studies.
