A Disturbance in the Force

Can customers prepare for the coming round of protocol enhancements?

The Internet is quietly being replumbed. That shouldn't surprise anyone involved with it; the Internet is always being replumbed. But you might be more surprised to learn that the next few years will bring an unusual burst of changes in that plumbing, some with great potential consequences for anyone who relies on the Net.

By "plumbing," I of course refer to the protocols and software that make the core features of the Internet work. These have been evolving steadily since 1969, but I don't think any period since the early 1980s has seen as many changes as we'll see over the next few years.

Like anything new, these changes will bring both threats and opportunities - but in this case, probably more threats than opportunities. Each critical part of your organization's infrastructure is potentially at risk from any fundamental change, and there will be several such changes in succession.

The Next Big Things
DNSSEC

For years, experts have warned that the Domain Name System, one of the most important subsystems on the Internet, is at severe risk from malicious actors. All sorts of schemes are possible if you can hijack someone else's domain name, and there are many ways to accomplish that hijacking. DNSSEC makes domain hijacking much, much harder, and therefore makes it more reasonable to trust the identities of Internet sites. It is the foundation for a more trusted net.

After years of work, a milestone was reached in 2010 when the root domain was signed with DNSSEC. Over the next few years, more and more sites will try to protect their identities and reputations with DNSSEC. The potential for breaking older or unusual DNS implementations can't be ignored, but any organization that has a lot invested in its domain name should consider using DNSSEC to protect it from hijacking and to reassure end users.

IPv6
The TCP/IP protocols were designed to facilitate what almost everyone thought was an absurdly big network - over 4 billion computers. Less than 30 years later, we all know (as I said in 1983, mostly to dismissive laughter) that the 4 billion addresses enabled by IPv4 are simply not enough. To keep the Net from fragmenting, to facilitate universal communication, and to avoid having the Net's growth stop dead in its tracks, it is essential that the world convert to IPv6.

Adoption of IPv6 has been slow, but there's a good reason to expect that to change: halfway through 2011, the supply of IPv4 addresses will simply run out. There are all sorts of half-measures and hacks that can postpone things a bit further, but by now it's clear that the future of the Internet requires IPv6. Despite the many person-centuries of work that have gone into IPv6, the transition is highly unlikely to be smooth and painless for everyone.
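The scale gap between the two protocols, and one of the transition mechanisms dual-stack software must handle, can be seen with Python's standard `ipaddress` module (the addresses below are reserved documentation examples, not real hosts):

```python
import ipaddress

# Address space sizes: IPv4's 2^32 versus IPv6's 2^128.
ipv4_total = 2 ** 32           # ~4.3 billion addresses
ipv6_total = 2 ** 128
print(ipv4_total)              # 4294967296

# The same stdlib call parses both families, which helps when
# auditing code for hidden IPv4-only assumptions.
addr = ipaddress.ip_address("2001:db8::1")
print(addr.version)            # 6

# IPv4-mapped IPv6 addresses (::ffff:a.b.c.d) are one of the
# hacks that let dual-stack software keep talking to IPv4 hosts.
mapped = ipaddress.ip_address("::ffff:192.0.2.1")
print(mapped.ipv4_mapped)      # 192.0.2.1
```

Code that assumes every address fits in 32 bits, or that every address contains exactly three dots, is exactly the kind of thing the transition will break.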

International Email Addresses
For as long as there has been Internet email, addresses have been limited to the ASCII character set. Spanish speakers can't use the letter "ñ" even if it's part of their name, and Germans similarly have to do without their "ö." They've been remarkably patient with what is, from their perspective, a gross inadequacy in the email standards. But the people who have it worst are users of Asian languages, whose characters are forbidden entirely in traditional email addresses. What the world wants is email addresses that can be written in any language and script.

After many years of wishing, arguing and working, the IETF is closing in on a solution. Internationalized domain names (the right-hand side of the email address) have been a reality for a little while now, and the IETF has been tackling the final bit, the left-hand side. This turns out to be much, much harder than it sounds, because of the problem of backward compatibility with the old standards and all the old mailers in the world.

The solution is going to be ugly, but functional. New encodings map ugly strings like "xn--bcher-kva.ch" onto desired internationalized forms such as "bücher.ch." Ideally, users will never see the ugly forms, which are designed to be backwards-compatible, but inevitably they sometimes will. Worse still, it may be impossible for a user of older software to reply to email from someone with an internationalized address.
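The ugly/pretty mapping for domain names is Punycode, which Python's standard library exposes as the "idna" codec (the stdlib codec implements the older IDNA 2003 rules, which suffice for this example):

```python
# Encode an internationalized domain name to its ASCII-compatible form.
encoded = "bücher.ch".encode("idna")
print(encoded)   # b'xn--bcher-kva.ch'

# Decoding recovers the internationalized form users should see.
decoded = b"xn--bcher-kva.ch".decode("idna")
print(decoded)   # bücher.ch
```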

The bottom line: we'll be going through a period during which email will probably not be quite as universal, or as stable, as we're accustomed to it being. Anyone with responsibility for software that processes email addresses will need to make sure that their software doesn't do horrible things when these new forms of addresses are encountered.
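As a concrete illustration of the kind of breakage to test for, here is a legacy-style, ASCII-only validator (the regex is illustrative, nowhere near the full address grammar) that silently rejects an internationalized address:

```python
import re

# A typical legacy pattern: ASCII characters only. Software built
# this way will reject internationalized addresses outright.
LEGACY_ADDRESS = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+$")

def legacy_accepts(address: str) -> bool:
    """Return True if the legacy validator would accept the address."""
    return bool(LEGACY_ADDRESS.match(address))

print(legacy_accepts("user@example.com"))   # True
print(legacy_accepts("jürgen@bücher.ch"))   # False: silently rejected
```

The failure mode is quiet: no crash, just mail that a legitimate correspondent can never send you.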

DKIM
The fight against spam will never end, because the miracle of Moore's Law - the same miracle that gives us ever smaller and more powerful computing devices - operates in favor of the spammers. Every time we get twice as good at detecting spam, spammers are able to generate twice as much spam for the same price, which means that the good guys are running on a treadmill, needing to work continuously just to avoid falling behind.

One manifestation of that hard work is the DKIM standard, which stands for "DomainKeys Identified Mail." It specifies a procedure by which an organization can publish cryptographic keys and sign all of its outgoing mail, thus making it somewhat easier to be sure where some messages really originate. It's far from a cure-all, but it has the potential - particularly when paired with as-yet-undefined reputation systems - to make it easier to detect spam with forged sender information, the issue at the heart of the "phishing" problem.

DKIM has been in development for several years now, and is progressing well through the standards process. It should be mostly invisible to end users, but will keep mail system administrators busy for a while. As they learn to configure outgoing mail for signatures and to check incoming mail for them, there is a strong potential for destabilizing the email environment in general. The most likely issue will be mail that simply doesn't reach its intended recipient - a much higher risk during the period when DKIM, or really any other anti-spam standard or technology, is being newly deployed.
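The signing idea can be sketched with a toy model. To be clear, this is not the real DKIM wire format or algorithm: real DKIM signs with a private key and publishes the public key in a DNS TXT record, so receivers need no shared secret. The key and helper names below are hypothetical; the HMAC is a stand-in that only illustrates the sign-then-verify flow.

```python
import hashlib
import hmac

# Hypothetical signing key; real DKIM uses an RSA key pair instead.
SIGNING_KEY = b"example.com-private-key"

def sign_message(headers: str, body: str) -> str:
    """Sign selected headers plus a hash of the body, as DKIM does."""
    body_hash = hashlib.sha256(body.encode("utf-8")).hexdigest()
    material = headers + "\nbody-hash:" + body_hash
    return hmac.new(SIGNING_KEY, material.encode("utf-8"),
                    hashlib.sha256).hexdigest()

def verify_message(headers: str, body: str, signature: str) -> bool:
    """A receiver recomputes the signature and compares it."""
    return hmac.compare_digest(sign_message(headers, body), signature)

sig = sign_message("From: alice@example.com", "Hello, world")
print(verify_message("From: alice@example.com", "Hello, world", sig))    # True
# Any tampering with the signed headers or body breaks verification:
print(verify_message("From: mallory@example.net", "Hello, world", sig))  # False
```

The operational work for administrators is in the parts the sketch omits: key generation, DNS publication, and deciding what to do with mail whose signature fails.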

Reputation Services
High on nearly everyone's list, in the wake of technologies such as DKIM, are reputation services - trusted parties that can tell you, when a message is signed as coming from Joe.com, whether Joe.com is known for sending spam or other bad things over the Internet.

Although there are no standards for reputation services yet - and although they are undeniably needed - we can already see the risks and benefits by looking at the non-standardized reputation services in use today, notably blacklists of email senders. These are incredibly useful, but there is a never-ending stream of problems with organizations that are added to such lists inappropriately and then face administrative difficulties in getting themselves removed promptly.
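The mechanism behind most of today's blacklists is a DNS lookup: the sender's IP address is reversed and queried under the list's zone, and any answer means "listed." Here is a sketch of just the query-name construction, with a hypothetical zone name (a real check would then resolve the name against the list operator's DNS):

```python
def dnsbl_query_name(ip: str, zone: str = "dnsbl.example.org") -> str:
    """Build the DNS name a blacklist lookup would resolve.

    The IP's octets are reversed and appended to the list's zone,
    mirroring the convention used by DNS-based blacklists.
    """
    octets = ip.split(".")
    return ".".join(reversed(octets)) + "." + zone

print(dnsbl_query_name("192.0.2.99"))   # 99.2.0.192.dnsbl.example.org
```

Note that the whole scheme hinges on the list operator, not the protocol: the lookup is trivial, while getting an entry corrected is a human process.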

Similar considerations will surely apply to the standardized reputation services of the future - no such service can be any better than the support organization that deals with exceptions and problems. Any progress with reputation standards should be expected to be accompanied by transitional pains as the reputation service bureaus mature and develop good or bad reputations themselves.

What Can Customers Do?
Make no mistake, the coming improvements to the Internet's plumbing are a very good thing. But the implementation of each of them brings with it the potential for destabilizing various aspects of the Internet infrastructure, despite the heroic efforts of the IETF to minimize that risk.

Vendors can increase or reduce the risk through their quality of implementation. What can customers do?

Paradoxically, the answer is to do more by doing less. The biggest risks are inevitably found in the least professionally administered software and servers. The big cloud providers, with their staffs of crack programmers and administrators, are at the least risk, because they understand the risks well enough to take steps far in advance. But that specialized application that your predecessor commissioned 10 years ago, and that is now running more or less autonomously on an ancient server in your headquarters, could represent a huge risk.

Basically, the risk is highest where the least attention is being paid. The best thing that most organizations can do, in preparation for the coming instabilities, is to use fear of the unknown as an excuse to clean house a bit:

  • Decommission old applications that aren't being maintained
  • Outsource anything you can plausibly outsource to a bigger IT shop
  • Allocate a few programming resources to pay attention to the ones you can't decommission or outsource

Of course, it can't hurt to ask your cloud provider or outsourcer what they're doing to prepare for the coming changes, but if they act surprised by any of them, it may be time to consider a new provider.

Ideally, the coming Internet disturbances should be viewed as an opportunity to streamline some of your oldest, least maintained, most idiosyncratic infrastructure. In a world where there are professionals who can run most of your applications for you, locally or in the cloud, it's probably time for your organization to move beyond worrying about these kinds of changes. Decommission the old stuff, outsource whatever you can, and the coming problems will largely be problems for someone else, not you.

And that's about the best you can hope for as the Internet endures its growing pains.

More Stories By Nathaniel Borenstein

Nathaniel Borenstein is chief scientist for cloud-based email management company Mimecast. At Mimecast, he is responsible for driving the company’s product evolution and technological innovation. Dr. Borenstein is the co-creator of the Multipurpose Internet Mail Extensions (MIME) email standard and developer of the Andrew Mail System, metamail software and the Safe-Tcl programming language.

Previously, Dr. Borenstein worked as an IBM Distinguished Engineer, responsible for research and standards strategy for the Lotus brand, and as a faculty member at the University of Michigan and Carnegie Mellon University. He also founded two successful Internet cloud service start-ups: First Virtual Holdings, the first Internet payment system, and NetPOS, the first Internet-centric point-of-sale system.
