By Matthew McKenna
August 6, 2016 05:00 PM EDT
The Blind Spot of the Security Industry: SSH User Keys
More than 95 percent of the world's enterprises rely on SSH user keys to give administrators and developers encrypted access to critical infrastructure: operating systems, applications, payment processing systems, databases, human resource and financial systems, routers, switches, firewalls and other network devices. SSH is a lifeline of traffic within our data centers and cloud environments, and the channel through which third-party vendors and supply chains access our environments. It has done its job quietly and efficiently for the last two decades. Unfortunately, the access SSH provides - in particular the access SSH user keys provide - has gone largely unmanaged, to an epic degree.
There's an unfortunate parallel here to a recent Oscar-winning film based on the events that took down Wall Street in 2008: The Big Short. Iconoclastic investor Michael Burry discovered that the subprime home loans market was headed for default. He bet more than $1 billion against the housing industry - an unprecedented move. Sadly, for everyone but him and his investors, Burry was right.
Just as Wall Street had a blind spot when it came to subprime loans, the security industry has a blind spot regarding the SSH protocol and SSH user keys. In fact, there are three parallels between the movie's plot and the current state of SSH management:
- A lack of understanding of the problem and its severity
- The difficulty of providing oversight of the problem
- The significance of the consequences of not addressing it
A Real and Far-Reaching Problem
The SSH protocol is not well understood by the majority of people, much like the subprime loan market. In fact, very few understand in sufficient detail the underlying critical access it is providing to our most important infrastructure. In this case, however, ignorance is not bliss.
For instance, in a typical financial enterprise with 20,000 Unix/Linux servers, one can expect to find up to 4 million SSH user keys providing interactive and machine-to-machine access. In many cases, 10 to 20 percent of these keys provide root-level access - the highest level of privilege at the operating-system level - and cannot be associated with an owner within the enterprise. This is not just a compliance and risk issue. It is a resilience issue, because unmanaged keys can translate directly into downtime for critical services.
Oversight and Governance Are Difficult or Non-Existent
Chiefly because SSH has long been seen as an encryption protocol rather than a means of access, this problem went unnoticed for a long time. As a result, it has not been considered part of access governance processes and frameworks. In fact, until October 2015, when NIST published NISTIR 7966, there were no NIST guidelines covering best practices for SSH user key-based access.
HIPAA, SOX, PCI and other regulatory guidelines do mention access controls, such as least privilege and segregation of duties, but none of them specifically address SSH user keys as a form of access that needs to be controlled. Is this due to a lack of understanding of SSH or an unwillingness to open Pandora's Box?
Regarding the lack of oversight and governance of SSH user key-based access, there are three technical aspects to consider. First, SSH user keys are the only form of access users can provision for themselves without oversight or control. Unlike passwords, which are issued and reset through managed processes, an administrator, developer or third-party vendor can generate a key pair and grant themselves access to your critical infrastructure, or automate a process or file transfer, with no one approving the change. The provisioning, de-provisioning and recertification of SSH user key-based access is rarely, if ever, addressed in the IAM frameworks of enterprises.
The second consideration is that SSH user keys don't have expiration dates, in contrast to certificates. This means that SSH user key-based access continues to exist, even after an application is decommissioned or a user leaves the company.
And the third consideration: an SSH user key does not need to be associated with a user account, so a key will not necessarily establish the identity of the user behind it. When a key pair is generated, no identity is bound to it; the key's comment field is free text and proves nothing.
Taken together, here's what these considerations mean. A user can provision SSH user key-based access to our most critical infrastructure themselves, without oversight. That access never expires. It is not clear which identity it belongs to. And there are millions of these keys across an enterprise - keys no one has an inventory of - providing access to the most critical systems.
The Ramifications Are Significant
Beyond the lack of understanding of the SSH protocol's technical aspects and the absence of regulatory oversight and governance of SSH user key-based access, there is an additional cause for concern. SSH is the tool of choice for malicious actors - rogue insiders and external hackers alike. It is the primary means by which they move laterally within an enterprise to reach new assets and further elevate privilege, and it is how they exfiltrate data and assets from an organization. In fact, LightCyber's Cyberweapons 2016 Report indicates that in over 50 percent of cases, the SSH protocol is the tool used to achieve this lateral movement across the assets of an enterprise.
It's also important to factor in the reality that bad actors know SSH user keys are not being managed. As a result, they go after private keys to gain access to assets. From there, they will often create new key pairs that grant them direct outbound access, or move those assets automatically to servers in the cloud. This is achieved using port forwarding over an SSH tunnel, which lets the attacker extract data inside the encrypted channel itself, rendering firewalls ineffective.
This kind of activity takes place right within our defensive structures. Why? Because we don't know who owns the keys granting that encrypted access, and because we cannot look inside the encryption of those sessions. The potential impact is clear. We lose our data - or worse, our customers' data. We lose our intellectual property. We lose our reputation and our brand and, in turn, revenue and shareholder value.
The financial implications are clear, but there is another implication that's at least as serious: operational resilience. The potential downtime of critical systems, should SSH user key-based access to those systems be compromised, is a concern we need to take seriously. Our disaster recovery systems, which are failovers for our production systems, often share identical SSH user keys so that their processes fail over identically. This means any key compromised in a production environment equally compromises the failover to disaster recovery.
Clear Parallels Pointing to Danger
Let's circle back to the similarities between what's going on with SSH user key management and the events leading up to the housing crisis. The parallels are uncanny. There's a lack of market understanding around the power of the SSH protocol and the criticality of the access it provides to our infrastructures. There's a gap in regulatory and governing oversight of how this SSH user key-based access should be monitored, provisioned, de-provisioned and recertified. There's an encrypted protocol, which malicious actors use as their tool of choice to extend their access and reach within our enterprises, create backdoors and exfiltrate data. As a result, there's a potential financial, operational and brand impact that can conservatively be described as significant to our businesses.
Poor SSH user key management may not send all organizations using SSH into a rapid downward spiral that obliterates most of them, but it can still result in devastating damage to any organization that's ignored this issue.