
SSH is a reliable means of providing remote access securely for administrators and vendors

Why SSH User Keys Are Now a C-Suite Issue

With so much on their plates, most C-level executives don't have the bandwidth to think about SSH user keys. It would be much simpler to leave such details to the IT department - but that would be a mistake. Why? Because SSH means access - to an organization's servers and data. The creation of SSH user keys is not a controlled process, which means that someone can create credentials without oversight. That fact should be enough to get the C-suite's attention.

SSH is a reliable means of providing secure remote access for administrators and vendors and, in many cases, of securely moving your data from point A to point B. SSH has been in your organization for the last 15 years, doing its job quietly and efficiently behind the scenes. It comes pre-installed at the operating system level on all Unix and Linux variants. It sits on 60 percent of the world's Web servers. It is the primary means of accessing and maintaining servers as well as your network devices - the routers, switches and firewalls that keep your business running.

Who's Holding the Keys?
It may seem strange - even reckless - that such a powerful protocol has never been controlled, but there are several reasons for this. For one thing, in the majority of companies today, SSH is not centrally owned or governed by any particular group within the business. For the last decade it has sat in the underworld of Unix administrators and local application owners, with access provisioned on an ad hoc basis. Administrators are not bad guys; they are the people tasked with keeping your systems up and running in the middle of the night, and they want to do their jobs quickly and efficiently.
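To make "provisioned on an ad hoc basis" concrete: granting SSH access is nothing more than appending one line to a user's `authorized_keys` file. The sketch below is a hypothetical illustration - the key material and file paths are fabricated, and real keys would come from `ssh-keygen` - but the grant itself really is just this append, with no approval step built into the protocol.

```python
import os
import tempfile

# Fabricated public key line (real keys are generated with ssh-keygen).
new_key = "ssh-ed25519 AAAAC3FabricatedKeyMaterial contractor@laptop"

# Stand-in for a user's ~/.ssh/authorized_keys file.
workdir = tempfile.mkdtemp()
authorized_keys = os.path.join(workdir, "authorized_keys")

# One append and a new credential is trusted for login --
# no ticket, no sign-off, no central record.
with open(authorized_keys, "a") as f:
    f.write(new_key + "\n")

with open(authorized_keys) as f:
    count = len(f.readlines())
print(f"{count} trusted credential(s) now on this account")
```

Anyone with write access to their own home directory can do this, which is why key creation outpaces any oversight process layered on afterward.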

A second reason is that, until the release of NIST Interagency Report (NISTIR) 7966, "Security of Interactive and Automated Access Management Using Secure Shell (SSH)," there was no authoritative guidance on the topic. The major consulting houses are only now starting to develop practices to help their customers approach and manage the risk.

And third, this form of access has not been something identity and access management teams have seen as within their remit to control. SSH keys are slowly but surely finding their way into privileged access management projects, but mostly from the perspective of "flesh and blood" users of the keys. Unfortunately, upwards of 80 percent of SSH-related access is for machine-to-machine connections. On the other side of the coin, cryptography teams have only been asked to look at certificates and encryption keys; SSH user keys have traditionally not been in their remit, either.

The SSH protocol is not well understood among executives, even those at global corporations, so they are not aware that it is one of the largest uncontrolled access risks to their businesses. As they become educated about SSH, they raise a very logical question: "I have never heard of a breach related to SSH or SSH user keys - why should I do something about this now?"

The answer to that question is another question: how could you know if you've had an SSH-based breach? The majority of companies today do not have inventories of the SSH user keys their administrators have created over the years. These key-based credentials are not monitored: you don't know where they are being used from, how frequently or why, and you cannot associate them back to owners in your environment. The fact is, you wouldn't know if a malicious actor had access to SSH user keys in your environment, and you wouldn't know if they were misusing this encrypted channel to exfiltrate data out of your organization. In essence, it is the perfect attack vector for malicious actors.
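A first step toward answering that question is simply building the inventory most organizations lack. The sketch below (a minimal illustration; the directory layout, accounts and key data are all fabricated) walks a set of home directories and records every `authorized_keys` entry it finds, keyed by account, key type, a fingerprint of the key blob, and the comment field - enough to start asking who owns what.

```python
import hashlib
import os
import tempfile

# Fabricated demo environment: two "home directories" with key files.
root = tempfile.mkdtemp()
demo = {
    "alice/.ssh/authorized_keys": "ssh-ed25519 QWxpY2VLZXk= alice@db01\n",
    "svc_batch/.ssh/authorized_keys": "ssh-rsa QmF0Y2hLZXk= batch@appserver\n"
                                      "ssh-ed25519 T2xkS2V5 left-over 2012\n",
}
for rel, body in demo.items():
    path = os.path.join(root, rel)
    os.makedirs(os.path.dirname(path), exist_ok=True)
    with open(path, "w") as f:
        f.write(body)

# Walk the tree and record (account, key type, fingerprint, comment).
inventory = []
for dirpath, _, files in os.walk(root):
    for name in files:
        if name != "authorized_keys":
            continue
        account = os.path.relpath(dirpath, root).split(os.sep)[0]
        with open(os.path.join(dirpath, name)) as f:
            for line in f:
                parts = line.split(None, 2)
                if len(parts) < 2:
                    continue  # skip blank or malformed lines
                ktype, blob = parts[0], parts[1]
                comment = parts[2].strip() if len(parts) > 2 else ""
                fpr = hashlib.sha256(blob.encode()).hexdigest()[:16]
                inventory.append((account, ktype, fpr, comment))

for entry in sorted(inventory):
    print(entry)
```

Even this toy scan surfaces the characteristic finding: a key with a comment like "left-over 2012" that nobody can attribute to a current owner.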

SSH user keys provide unlimited access to an organization's trading systems, payment processing systems, databases where credit card information or patient data is kept, the routers and switches that keep information flowing efficiently, and the file transfer systems that move that data. Imagine uncontrolled access to everything that keeps your business up and running. This is not only a question of risk and compliance; it is a question of resilience.

This Could Be You
Imagine: your organization has just hired a contractor to help with some IT-related work. The contractor is put through an approval process with the human resources team and approved by the business owners they will work for. You onboard them into your identity and access management system, provision them a user name and password, and give them a computer, and they are able to begin their work.

As the contractor gets to work, they begin to provision SSH user keys to gain access to test machines, and perhaps even leave some keys behind on production machines. Then their contracting period comes to an end. They are removed from your identity and access management systems, their user name and password are revoked, their computer collected. A job well done; the system worked. Unfortunately, it didn't. You forgot to revoke their key-based access - and you can't, because you never tracked or controlled the provisioning of this critical access. The contractor who has just left your organization still has access!
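Offboarding key-based access only works if the keys carry identifying metadata in the first place. The sketch below (fabricated key lines and a hypothetical `revoke` helper, for illustration only) removes every key whose comment matches a departed contractor - and shows why an uncommented, untracked key survives every such sweep.

```python
import os
import tempfile

# Fabricated authorized_keys contents: two keys carry identifying
# comments, one does not -- exactly what untracked provisioning leaves behind.
lines = [
    "ssh-ed25519 QWRtaW5LZXk= admin@jumphost\n",
    "ssh-ed25519 Q29udHJhY3Rvcg== contractor-7781@acme\n",
    "ssh-rsa QW5vbnltb3Vz\n",  # no comment: owner unknown
]
path = os.path.join(tempfile.mkdtemp(), "authorized_keys")
with open(path, "w") as f:
    f.writelines(lines)

def revoke(key_file, marker):
    """Drop every key line that contains the identifying marker."""
    with open(key_file) as f:
        kept = [line for line in f if marker not in line]
    with open(key_file, "w") as f:
        f.writelines(kept)
    return kept

kept = revoke(path, "contractor-7781")
print(f"{len(kept)} key(s) remain; the uncommented key cannot be "
      "attributed to anyone, so no offboarding sweep will ever catch it")
```

The anonymous third key is the point: without an inventory tying keys to people, revocation by pattern-matching is guesswork.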

Taking Control of Your Keys
Organizations can't afford to treat SSH user key-based access lightly, hoping that upper-level executives or auditors don't find out about it. To give a sense of the scale of the issue: in a banking environment with 30,000 Unix/Linux servers, one can expect to find up to 4.5 million SSH user keys for application-to-application connections and 450,000 SSH user keys for system administrator- or database administrator-related access - an average of roughly 150 application keys and 15 administrative keys per server. That's 4.5 million potential points of access to your most critical systems over which you don't have a strong degree of control.

With an understanding of how critical SSH keys are, talk to the teams tasked with encryption, infrastructure and access management. Find out where responsibility lies in your organization for controlling SSH user key-based access, and for provisioning and de-provisioning that access. Is there an access control policy in place, and can you enforce it? Getting clear answers to these questions will help you secure access to your organization's treasure chest of data.

More Stories By Matthew McKenna

Matthew McKenna is Chief Strategy Officer and vice president of Key Accounts at SSH Communications Security. He brings over 15 years of high technology sales, marketing and management experience to SSH Communications Security and drives strategy, key account sales and evangelism. His expertise in strategically delivering technology solutions that anticipate the marketplace has helped the company become a market leader.

Prior to joining the company, Matthew served as a member of the executive management team of ADP Dealer Services Nordic and Automaster Oy, where he was responsible for international channel operations and manufacturer relations. In addition, he was responsible for key accounts including Mercedes Benz, General Motors, and Scania CV. Before this, he played professional soccer in Germany and Finland.

Matthew holds a Bachelor of Arts degree in German from the University of South Carolina and an MBA from the Helsinki School of Economics and Business Administration.


