By Nathaniel Borenstein
August 30, 2010 07:15 AM EDT
The Internet is quietly being replumbed. That shouldn't surprise anyone involved with it; the Internet is always being replumbed. But you might be more surprised to learn that the next few years will bring an unusual burst of changes in that plumbing, some with great potential consequences for anyone who relies on the Net.
By "plumbing," I of course refer to the protocols and software that make the core features of the Internet work. These have been evolving steadily since 1969, but I don't think any period since the early 1980s has seen as many changes as we'll see over the next few years.
Like anything new, these changes will bring both threats and opportunities - but in this case, probably more threats than opportunities. Each critical part of your organization's infrastructure is potentially at risk from any fundamental change, and there will be several such changes in succession.
The Next Big Things
For years, experts have warned that the Domain Name System, one of the most important subsystems on the Internet, is at severe risk from malicious actors. All sorts of schemes are possible if you can hijack someone else's domain name, and there are many ways to accomplish that hijacking. DNSSEC makes domain hijacking much, much harder, and therefore makes it more reasonable to trust the identities of Internet sites. It is the foundation for a more trusted net.
After years of work, a milestone was reached in 2010 when the root domain was signed with DNSSEC. Over the next few years, more and more sites will try to protect their identities and reputations with DNSSEC. The potential for breaking older or unusual DNS implementations can't be ignored, but any organization that has a lot invested in its domain name should consider using DNSSEC to protect it from hijacking and to reassure end users.
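The heart of DNSSEC is a chain of trust: each parent zone publishes a DS record that is, in essence, a digest of its child zone's public key (DNSKEY), so a validator can detect a substituted key. The Python sketch below uses toy byte strings to illustrate the check; real DS and DNSKEY records are structured per RFC 4034 and the digest covers more fields than shown here.

```python
import hashlib

# Toy illustration of DNSSEC's chain of trust: the parent zone's DS
# record holds a digest of the child zone's DNSKEY. A validator
# recomputes the digest and compares. (Real records are structured
# per RFC 4034; this is deliberately simplified.)
child_dnskey = b"example.com. DNSKEY (toy key material)"
parent_ds = hashlib.sha256(child_dnskey).hexdigest()

def key_matches_ds(dnskey: bytes, ds_digest: str) -> bool:
    """Return True if the presented key hashes to the parent's DS digest."""
    return hashlib.sha256(dnskey).hexdigest() == ds_digest

print(key_matches_ds(child_dnskey, parent_ds))    # True: genuine key
print(key_matches_ds(b"attacker key", parent_ds)) # False: hijack detected
```

A hijacker who answers DNS queries with a forged key fails this check, which is what makes signed delegations so much harder to spoof than plain DNS.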
The TCP/IP protocols were designed to facilitate what almost everyone thought was an absurdly big network - over 4 billion computers. Less than 30 years later, we all know (as I said in 1983, mostly to dismissive laughter) that the 4 billion addresses enabled by IPv4 are simply not enough. To keep the Net from fragmenting, to facilitate universal communication, and to avoid having the Net's growth stop dead in its tracks, it is essential that the world convert to IPv6.
Adoption of IPv6 has been slow, but there's a good reason to expect that to change: halfway through 2011, the supply of IPv4 addresses will simply run out. There are all sorts of half-measures and hacks that can postpone things a bit further, but by now it's clear that the future of the Internet requires IPv6. Despite the many person-centuries of work that have gone into IPv6, the transition is highly unlikely to be smooth and painless for everyone.
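The scale of the jump is easy to state precisely: IPv4's 32-bit addresses allow 2^32 endpoints, while IPv6's 128-bit addresses allow 2^128. A sketch using Python's standard ipaddress module (which postdates this article) shows the two families side by side, along with the IPv4-mapped convention that dual-stack software uses during the transition:

```python
import ipaddress

# The ceilings of the two address spaces:
print(2 ** 32)   # 4294967296 -- the ~4 billion IPv4 limit
print(2 ** 128)  # roughly 3.4e38 addresses under IPv6

# The same kinds of hosts, in each family (documentation prefixes):
v4 = ipaddress.ip_address("192.0.2.1")
v6 = ipaddress.ip_address("2001:db8::1")
print(v4.version, v6.version)  # 4 6

# IPv4-mapped IPv6 addresses (::ffff:a.b.c.d) let dual-stack code
# carry v4 endpoints inside v6 data structures during the transition:
mapped = ipaddress.ip_address("::ffff:192.0.2.1")
print(mapped.ipv4_mapped)  # 192.0.2.1
```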
International Email Addresses
For as long as there has been Internet email, addresses have been limited to the ASCII character set. Spanish speakers can't use the letter "ñ" even if it's part of their name, and Germans similarly have to do without their "ö." They've been remarkably patient with what is, from their perspective, a gross inadequacy in the email standards. But the people who have it worst are Asians, whose characters are forbidden entirely in traditional email addresses. What the world wants is email addresses written in users' own scripts, on both sides of the @ sign.
After many years of wishing, arguing and working, the IETF is closing in on a solution. Internationalized domain names (the right-hand side of the email address) have been a reality for a little while now, and the IETF has been tackling the final bit, the left-hand side. This turns out to be much, much, much harder than it sounds, because of the problem of backward compatibility with the old standards and all the old mailers in the world.
The solution is going to be ugly, but functional. New encodings map ugly strings like "xn--bcher-kva.ch" onto desired internationalized forms such as "bücher.ch." Ideally, users will never see the ugly forms, which are designed to be backward-compatible, but inevitably they sometimes will. Worse still, it may be impossible for a user of older software to reply to email from someone with an internationalized address.
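That mapping can be demonstrated with Python's built-in "idna" codec, which implements the IDNA 2003 rules for domain names, a minimal sketch:

```python
# Python's built-in "idna" codec performs the ASCII-compatible
# encoding (Punycode) of internationalized domain names, label
# by label, so old DNS software only ever sees ASCII.
ascii_form = "bücher.ch".encode("idna")
print(ascii_form)  # b'xn--bcher-kva.ch'

# Decoding recovers the internationalized form users should see:
print(ascii_form.decode("idna"))  # bücher.ch
```

This handles the domain side; the internationalized left-hand side of an email address is the harder part the IETF is still finishing, and it uses a different mechanism.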
The bottom line: we'll be going through a period during which email will probably not be quite as universal, or as stable, as we're accustomed to it being. Anyone with responsibility for software that processes email addresses will need to make sure that their software doesn't do horrible things when these new forms of addresses are encountered.
The fight against spam will never end, because the miracle of Moore's Law - the same miracle that gives us ever smaller and more powerful computing devices - operates in favor of the spammers. Every time we get twice as good at detecting spam, spammers are able to generate twice as much spam for the same price, which means that the good guys are running on a treadmill, needing to work continuously just to avoid falling behind.
One manifestation of that hard work is the DKIM standard, which stands for "DomainKeys Identified Mail." It specifies a procedure by which an organization can publish a cryptographic key and sign all of its outgoing mail, thus making it somewhat easier to be sure where messages really originate. It's far from a cure-all, but it has the potential - particularly when paired with as-yet-undefined reputation systems - to make it easier to detect spam with forged sender information, the issue at the heart of the "phishing" problem.
DKIM has been in development for several years now, and is progressing well through the standards process. It should be mostly invisible to end users, but will keep mail system administrators busy for a while. As they learn to configure outgoing mail for signatures and to check incoming mail for signatures, there is a strong potential for destabilizing the email environment in general. The most likely issue will be mail that just doesn't reach its intended recipient. That's a much higher risk during the period that DKIM - or really, any other anti-spam standards and technologies - are being newly deployed.
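The DNS side of DKIM is straightforward to illustrate. Per RFC 6376, a verifier fetches the signer's public key from a TXT record whose name combines a selector with the fixed label _domainkey; the selector and domain below are hypothetical:

```python
# Per RFC 6376, a DKIM public key is published in a DNS TXT record at
# <selector>._domainkey.<signing-domain>. The names here are made up.
def dkim_key_name(selector: str, domain: str) -> str:
    """Build the DNS name a verifier queries for the signer's key."""
    return f"{selector}._domainkey.{domain}"

print(dkim_key_name("mail2010", "example.com"))
# mail2010._domainkey.example.com

# The DKIM-Signature header on a message carries, among other tags:
#   d= the signing domain    s= the selector
#   h= the headers covered   b= the signature itself
# (values elided; a real header is generated by the signing software)
example_header = (
    "DKIM-Signature: v=1; a=rsa-sha256; d=example.com; s=mail2010; "
    "h=from:to:subject; bh=...; b=..."
)
```

Administrators who rotate selectors can publish a new key without invalidating mail signed under the old one, which is part of what makes the deployment period manageable.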
High on nearly everyone's list, in the wake of technologies such as DKIM, are reputation services - trusted parties that can tell you, when a message is signed as coming from Joe.com, whether Joe.com is known for sending spam or other bad things over the Internet.
Although there are no standards for reputation services yet - and although they are undeniably needed - we can already see the risks and benefits by looking at the non-standardized reputation services in use today, notably blacklists of email senders. These are incredibly useful, but they generate a never-ending stream of problems: organizations get added to such lists inappropriately, and then face real administrative difficulty getting themselves removed promptly.
Similar considerations will surely apply to the standardized reputation services of the future - no such service can be any better than the support organization that deals with exceptions and problems. Any progress with reputation standards should be expected to be accompanied by transitional pains as the reputation service bureaus mature and develop good or bad reputations themselves.
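Most of today's sender blacklists are DNS-based (DNSBLs, the convention later codified in RFC 5782): to check an IP address, a mail server reverses its octets, prepends them to the list's zone, and sees whether the resulting name resolves. A sketch of the name construction, using a placeholder zone rather than any real service:

```python
# DNSBL convention (RFC 5782): reverse the IPv4 octets and append
# the blacklist zone. If the resulting name resolves, the address
# is listed. "dnsbl.example.org" below is a placeholder zone.
def dnsbl_query_name(ip: str, zone: str) -> str:
    """Build the DNS name to query for a sender-IP blacklist lookup."""
    octets = ip.split(".")
    return ".".join(reversed(octets)) + "." + zone

print(dnsbl_query_name("192.0.2.99", "dnsbl.example.org"))
# 99.2.0.192.dnsbl.example.org
```

The mechanism is trivially simple; as the article notes, the hard part is the human process behind the list, and that will be just as true for standardized reputation services.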
What Can Customers Do?
Make no mistake, the coming improvements to the Internet's plumbing are a very good thing. But the implementation of each of them brings with it the potential for destabilizing various aspects of the Internet infrastructure, despite the heroic efforts of the IETF to minimize that risk.
Vendors can increase or reduce the risk through their quality of implementation. What can customers do?
Paradoxically, the answer is to do more by doing less. The biggest risks are inevitably found in the least professionally administered software and servers. The big cloud providers, with their staffs of crack programmers and administrators, are at the least risk, because they understand the risks well enough to take steps far in advance. But that specialized application your predecessor commissioned 10 years ago, now running more or less autonomously on an ancient server in your headquarters, could represent a huge risk.
Basically, the risk is highest where the least attention is being paid. The best thing that most organizations can do, in preparation for the coming instabilities, is to use fear of the unknown as an excuse to clean house a bit:
- Decommission old applications that aren't being maintained
- Outsource anything you can plausibly outsource to a bigger IT shop
- Allocate a few programming resources to pay attention to the ones you can't decommission or outsource
Of course, it can't hurt to ask your cloud provider or outsourcer what they're doing to prepare for the coming changes, but if they act surprised by any of them, it may be time to consider a new provider.
Ideally, the coming Internet disturbances should be viewed as an opportunity to streamline some of your oldest, least maintained, most idiosyncratic infrastructure. In a world where there are professionals who can run most of your applications for you, locally or in the cloud, it's probably time for your organization to move beyond worrying about these kinds of changes. Decommission the old stuff, outsource whatever you can, and the coming problems will largely be problems for someone else, not you.
And that's about the best you can hope for as the Internet endures its growing pains.