Brass Tacks: Answering the Cloud Security Questions That Matter

Who is logging in? What is accessed? When was it changed? And more!

Enterprise security can be a labyrinthine, complex beast with many moving parts and dozens upon dozens of requirements, needs, implications, options and alternatives.

But when we get down to the nitty-gritty (the brass tacks, if you will), cloud security can be distilled into six simple questions:

WHO is logging in?

WHAT are they accessing/viewing?

WHERE is the device from which that person logs in?

WHEN was any asset changed/modified/moved?

HOW are they authorized/credentialed?

WHAT is the impact of the event?

Determining the answers to those questions might require a bit of coordination, but it is those answers that must drive any enterprise security initiative.

The concept of enterprise security is simple. Let in those you want to see and access data, and keep everyone else out. Of course, the addendum to that is that those on the “inside” cannot (and should not) distribute information outside the “Circle of Trust.” Now note I said the concept is simple. The application of those concepts is much more complex. It requires the coordination and integration of several features from several different solutions. If you are a CFO, I can see you rolling your eyes…how much is this going to cost? (The answer: not as much as you think if you deploy cloud-based solutions.) If you are a CTO, I can hear that you’ve got it covered. If so, great, but there are thousands upon thousands of companies that don’t, or that have only a portion of the equation adequately addressed. And the biggest issue with the latter is that too many professionals mistakenly think their portion is doing the whole job.

In journalism class, I learned that the 5 W’s and an H (who, what, when, where, why and how) are what every news story needs. Extrapolated, it’s what every enterprise security initiative needs as well. If you can answer these questions, then you’re definitely on the right path to securing your assets as well as the privacy of your users. Yet the key to answering these questions is having certain solutions and policies in place. Beyond that, you need those access management solutions and policies to “talk” to each other. It’s no longer good enough to have user provisioning if it doesn’t communicate with intrusion detection (SIEM/log management) or single sign-on (access management). That brings us to our first question:

Who is logging in? You can implement all industry standards and follow all regulatory frameworks and best practices, but if you're sloppy from the start with verifying identities, maintaining credentials or parsing the data, then your security is no better than blind man’s bluff. Let’s say Ms. Jones leaves your employ. If you don’t have user provisioning, it may take days or weeks (if ever) to remove her permissions from network access. Without intrusion detection, you won’t know if she's trying to access her account after her employment relationship ends. But assume she's an honest person and that your company de-provisioned her accounts. It still doesn’t mean someone isn’t trying to breach your systems using her credentials. How will you know? If an alert policy is set to notify someone when access is attempted against a dormant or retired account (or after three failed log-in tries), you can immediately trace it to the source IP and remediate the issue. The thing is, if you think whoever is attempting a breach is limited to just Ms. Jones’ former account, you’re sadly mistaken. By integrating single sign-on, identity management and SIEM, you get to know exactly who is logging in (with a record for compliance), and those without the proper credentials are left on the outside looking in.
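As a rough illustration of that alert policy, the sketch below flags any login attempt against a de-provisioned account and any account that racks up three failed attempts. The account names, threshold and event format are hypothetical placeholders; a real SIEM would apply equivalent rules to its own log schema.

```python
from collections import Counter

# Hypothetical account list and threshold; in practice these come from
# the identity-management system and the SIEM's alerting configuration.
DEPROVISIONED_ACCOUNTS = {"ms.jones"}
FAILED_LOGIN_THRESHOLD = 3

def alerts_for(events):
    """Yield alert messages for suspicious login events.

    Each event is a dict such as:
    {"user": "ms.jones", "ip": "198.51.100.9", "success": False}
    """
    failures = Counter()
    for e in events:
        if e["user"] in DEPROVISIONED_ACCOUNTS:
            yield f"ALERT: attempt on retired account {e['user']} from {e['ip']}"
        if not e["success"]:
            failures[e["user"]] += 1
            if failures[e["user"]] == FAILED_LOGIN_THRESHOLD:
                yield f"ALERT: {FAILED_LOGIN_THRESHOLD} failed logins for {e['user']} from {e['ip']}"

if __name__ == "__main__":
    sample = [
        {"user": "ms.jones", "ip": "198.51.100.9", "success": False},
        {"user": "c.smith", "ip": "192.0.2.4", "success": False},
        {"user": "c.smith", "ip": "192.0.2.4", "success": False},
        {"user": "c.smith", "ip": "192.0.2.4", "success": False},
    ]
    for alert in alerts_for(sample):
        print(alert)
```

The point is not the code itself but the integration: the retired-account list only exists if provisioning and the SIEM actually talk to each other.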

What are they accessing? Just because a user provides the proper user name and password combo doesn’t mean they should get the keys to the entire kingdom. Think of what would happen if all the currency plates or gold reserves were held in the same building. Anyone with a key and a wheelbarrow could be wealthy beyond the dreams of avarice. The same needs to be considered for your IT assets, your proprietary sensitive data and personal information. What files, data and applications are necessary for a person to do their job or place an order securely? Authenticated access opens the door, but it is role-based provisioning that limits what they can see, based on the needs of the person and the permission policies ascribed to them. Does shipping & receiving need human resources applications to send Product X to a customer? In this scenario, integrating single sign-on with identity management creates the necessary automation and automatically compiles the compliance information for reporting. That’s something I call Enterprise Access Control. The next step is ensuring the data is properly encrypted…but that’s a blog for another day.
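To make the role-based idea concrete, here is a minimal sketch of a permission check. The role names and application lists are invented for illustration; an actual identity-management product evaluates centrally managed policies rather than a static dictionary in application code.

```python
# Hypothetical role-to-application policy for illustration only.
ROLE_PERMISSIONS = {
    "shipping_receiving": {"order_tracking", "label_printing"},
    "human_resources": {"payroll", "benefits", "order_tracking"},
}

def can_access(role: str, application: str) -> bool:
    """Grant access only if the role's policy explicitly lists the app."""
    return application in ROLE_PERMISSIONS.get(role, set())

# Shipping & receiving can send Product X without ever touching HR data.
assert can_access("shipping_receiving", "order_tracking")
assert not can_access("shipping_receiving", "payroll")
```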

Where is the device? You want to know that the activity is coming from validated IP addresses. For example…Carl is your salesman and he spent the night in a hotel in Wichita. He logged in and uploaded some expense reports. No problem. He had the right credentials, and he accessed portions of the ERP that are based on his role permissions. Three hours later, Carl’s account is accessed again with the right user name and password, but this time the SIEM system picks up that the IP address is coming from Romania. Well, unless Carl has some superhuman power or used a Star Trek transporter, it isn’t him. With the significant rise in BYOD (tablets, iPhones, laptops, etc.), you need to make sure you know where the access is coming from. Even without profiling, you know that certain areas of the globe are more likely to be sources of hacking.
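The Carl scenario boils down to an impossible-travel check: if two logins on the same account imply a travel speed no airliner could manage, raise a flag. The sketch below uses the haversine formula with made-up timestamps and coordinates standing in for the IP-geolocation data a SIEM would supply.

```python
from math import radians, sin, cos, asin, sqrt

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

def impossible_travel(prev, curr, max_kmh=900):
    """Flag a pair of logins whose implied speed exceeds a jetliner's."""
    hours = (curr["time"] - prev["time"]) / 3600
    km = distance_km(prev["lat"], prev["lon"], curr["lat"], curr["lon"])
    return hours > 0 and km / hours > max_kmh

# Wichita at hour zero, then "Carl" from Romania three hours later.
wichita = {"time": 0, "lat": 37.69, "lon": -97.34}
romania = {"time": 3 * 3600, "lat": 44.43, "lon": 26.10}
print(impossible_travel(wichita, romania))  # True: it isn't Carl
```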

When was the asset changed? Much like “where,” when certain applications or data are accessed provides clues as to whether an event is suspicious or benign. By applying adaptive risk policies (predictive behavior protocols), you can plot when certain IP addresses log on. Does it raise any red flags that attempts on certain accounts or applications are happening at 3 in the morning? In and of itself, probably not. But if you apply context from various data sources, the picture becomes much clearer. If you are in Chicago, 3 a.m. is 9 a.m. in London. If you have customers or partners in London, there’s probably no cause for alarm. However, if the log-on is to an application that a partner would have no need for, then the time of the attempt is very relevant. But to correlate the appropriate context, your security functions need to be centralized and collaborating 24/7/365.
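Here is a minimal sketch of that kind of adaptive-risk rule: a 3 a.m. login is only flagged when it falls outside business hours in every region the account legitimately works with. The regions, offsets and business hours are hypothetical placeholders.

```python
from datetime import datetime, timezone, timedelta

# Hypothetical context: UTC offsets for regions this account works with.
EXPECTED_UTC_OFFSETS = {"Chicago": -6, "London": 0}
BUSINESS_HOURS = range(8, 19)  # 08:00-18:59 local time

def off_hours_everywhere(event_utc: datetime) -> bool:
    """True if the event is outside business hours in every expected region."""
    for offset in EXPECTED_UTC_OFFSETS.values():
        local = event_utc + timedelta(hours=offset)
        if local.hour in BUSINESS_HOURS:
            return False
    return True

# 09:00 UTC is 3 a.m. in Chicago but 9 a.m. in London: no alarm.
print(off_hours_everywhere(datetime(2017, 9, 1, 9, 0, tzinfo=timezone.utc)))  # False
# 02:00 UTC is off-hours in both regions: worth a closer look.
print(off_hours_everywhere(datetime(2017, 9, 1, 2, 0, tzinfo=timezone.utc)))  # True
```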

How are they credentialed? I’ve addressed role-based user provisioning and application access restriction, but the last part of the policy is ensuring that multi-factor authentication is applied. In short, the best practice is to create security questions whose answers cannot be easily found in any other digital forum (like birthdays, hire dates or names of family members). The other half is to insist on strong passwords that include letters, numbers and symbols. Recognizing that a single password has a certain likelihood of being cracked, authenticating a user on two of these factors makes it one step harder for someone to exploit your system. Multi-factor authentication will make it substantially more difficult for an attacker to gain access, but it’s certainly not going to stop a determined attacker. This is again why I insist on a layered and integrated security infrastructure.
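As a rough sketch of the password half of that policy, the check below enforces length plus letters, numbers and symbols. The specific rules are illustrative only; real policy engines layer on more, and the second factor still matters most.

```python
import string

def is_strong(password: str, min_length: int = 12) -> bool:
    """Illustrative strength check: length plus letters, digits and symbols."""
    return (
        len(password) >= min_length
        and any(c.isalpha() for c in password)
        and any(c.isdigit() for c in password)
        and any(c in string.punctuation for c in password)
    )

print(is_strong("correct-horse-battery-9"))  # True
print(is_strong("password123"))              # False: too short, no symbol
```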

What is the impact? Not all events require your IT SWAT team. Vulnerability scans using situational context allow you to separate the white noise and the everyday routine from suspicious events. Again, it is the flow of data from various silos and applications that makes security more effective and allows you to determine the threat level of any particular cluster of activity.
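One way to picture situational context is a simple additive risk score that only escalates an event when several independent signals line up. The signal names, weights and threshold below are made up for illustration; a real SIEM correlates far more inputs than this.

```python
# Hypothetical weights for a handful of correlated signals.
SIGNAL_WEIGHTS = {
    "retired_account": 40,
    "unexpected_geo": 30,
    "off_hours": 15,
    "sensitive_app": 25,
}
ESCALATION_THRESHOLD = 50

def risk_score(signals: set) -> int:
    """Sum the weights of whichever signals fired for this event."""
    return sum(SIGNAL_WEIGHTS.get(s, 0) for s in signals)

def needs_response(signals: set) -> bool:
    """Only page the IT SWAT team when correlated signals cross the bar."""
    return risk_score(signals) >= ESCALATION_THRESHOLD

print(needs_response({"off_hours"}))                        # False: routine noise
print(needs_response({"unexpected_geo", "sensitive_app"}))  # True: investigate
```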

But the reality is that answering all these questions takes investment. Beyond dollars, it takes an investment of time, resources, expertise and infrastructure. The overarching problem is that not every company has that luxury, or an excess of expenditures to apply against something that is looked upon as a cost center. If for no other reason than that, companies must look toward the cloud as a provider of security solutions. Cloud-based security can be seen as the great equalizer. Whether it is adding new capabilities or leveraging what you already have in house, it centralizes all the tools and all the expertise as a holistic security-as-a-service that broadens the reach and scope of enterprise monitoring, strengthens access authentication, and satisfies regulatory compliance. Because of the cost and resource savings, it shifts the dynamic of an initiative toward performance rather than limited scope.

To answer all the questions posed above, the most important element a security initiative must have is visibility. Without coordinating/correlating the data acquired from independent security initiatives, you are wasting resources, potentially duplicating efforts and not necessarily seeing the whole picture.

If the application of security were simple, then everybody would recognize that Superman was Clark Kent. Are you trying to tell me that a simple pair of glasses can fool everybody? So it really is all about context. You don’t expect to see Superman as Clark Kent, so you don’t. However, if you put all the information in context centrally, then you would see the truth. That is why it remains ever so important to continue to ask the questions of your security initiative: who, what, where, when and how?

Kevin Nikkhoo

CloudAccess


With more than 32 years of experience in information technology and an extensive and successful entrepreneurial background, Kevin Nikkhoo is the CEO of the dynamic security-as-a-service startup CloudAccess. CloudAccess is at the forefront of the latest evolution of IT asset protection: the cloud.

Kevin holds a Bachelor of Science in Computer Engineering from McGill University, a Master's in Computer Engineering from California State University, Los Angeles, and an MBA from the University of Southern California with an emphasis in entrepreneurial studies.
