
Welcome To The Realtime Intercloud

Mining the intercloud in realtime

In the past it was so much easier. Search engines could crawl the web at a leisurely pace, clean up the data, build indexes, and every so often roll out a new, improved search experience covering more web pages. That was the internet, the old non-realtime internet. Today it’s different! Not only are people interested in a lot more than just web pages, they also want to see everything LIVE, IN REALTIME. NO DELAYS. NO LATENCY. The solutions that were fine for the old web just don’t cut it for the realtime web.

It’s the same with enterprise applications and services. In the past all that was needed was a relational database or data warehouse. Just have the database administrator add in new data once every few days and prepare the database so that we can run our SQL queries again and see what has changed since we last ran them a week ago. Today, many of the key performance indicators and other metrics of critical relevance to a business need to be understood and analyzed every few minutes, and in the case of some businesses, every few seconds. NO DELAYS. NO LATENCY. Usually that means NO SQL.
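To make the contrast concrete, here is a minimal sketch (not any particular vendor's system) of a KPI maintained incrementally as events arrive, rather than recomputed by a periodic batch SQL query. The Python names and the 60-second window are illustrative assumptions.

```python
# A sliding-window KPI such as "orders per minute", updated on every event
# instead of being recomputed by a weekly batch query. Illustrative only.
import time
from collections import deque

class SlidingWindowCounter:
    """Counts events seen in the last `window_seconds` seconds."""

    def __init__(self, window_seconds=60):
        self.window = window_seconds
        self.events = deque()  # timestamps of recent events

    def record(self, timestamp=None):
        self.events.append(timestamp if timestamp is not None else time.time())

    def count(self, now=None):
        now = now if now is not None else time.time()
        # Drop events that have fallen out of the window.
        while self.events and self.events[0] < now - self.window:
            self.events.popleft()
        return len(self.events)

# Usage: update the KPI on every incoming event, read it whenever a
# dashboard or alerting rule asks, with no batch job in between.
orders_per_minute = SlidingWindowCounter(window_seconds=60)
orders_per_minute.record()
print(orders_per_minute.count())
```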

Mining The Intercloud In Realtime
As massive historical data sets and torrential realtime data streams flow into public and private clouds around the planet, the intercloud becomes essential to support new applications and services that are able to run across these clouds. The alternative would be to move these huge data sets and streams to a single location simply in order to compute with them. Moving many terabytes of data across today’s internet means huge bandwidth costs and unnecessary storage increases. Even more importantly, it means enormous latency. For non-realtime apps and services this is very bad; for realtime apps it is simply not an option.
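To put the latency point in rough numbers, here is an illustrative back-of-the-envelope calculation; the 10 TB data set and the sustained 1 Gbps link are assumptions for the example, not figures from the article.

```python
# Rough transfer-time arithmetic under assumed conditions:
# a 10 TB data set moved over a sustained 1 Gbps link.
terabytes = 10
bits = terabytes * 1e12 * 8    # total bits to move
seconds = bits / 1e9           # divide by link speed in bits per second
print(seconds / 3600)          # roughly 22 hours just to copy the data
```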

With intercloud apps and services we can keep the data sets and streams where they are, and instead move the compute power close to the data. The following simple examples show the power of intercloud computing.
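A minimal sketch of "move the compute to the data": each cloud runs a small aggregation locally over its own data and returns only a summary, and a coordinator merges those summaries instead of shipping the raw data across the internet. The per-cloud data sets and the count/total summary are illustrative assumptions.

```python
# Each cloud computes a tiny local summary next to its own data;
# the coordinator combines summaries, never raw records.

def local_summary(values):
    """Runs inside a single cloud, close to its data."""
    return {"count": len(values), "total": sum(values)}

def merge(summaries):
    """Runs at the coordinator; sees only small summaries."""
    count = sum(s["count"] for s in summaries)
    total = sum(s["total"] for s in summaries)
    return total / count if count else 0.0

# Stand-ins for data sets that live in three different clouds.
cloud_a = [102.5, 99.1, 101.7]
cloud_b = [98.4, 100.2]
cloud_c = [103.0, 97.9, 100.5, 99.8]

# Global mean computed without moving any raw data between clouds.
print(merge([local_summary(c) for c in (cloud_a, cloud_b, cloud_c)]))
```

The design point is that only kilobytes of summary cross the network, while the terabytes of underlying data stay where they were generated.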

In order to be able to trade optimally, a portfolio manager or trader needs deep insight into not only what is currently going on in the markets (patterns, trends etc.) but also what is happening in terms of patterns and trends in news, blogs, tweets, SEC/government filings etc. This information will, in general, be held on many public clouds, private clouds and private datacenters around the world. Unless the trader’s organization is willing to incur the huge costs involved in having ALL of this realtime data instantly available in-house, the only alternative is to use an intercloud infrastructure which can reach out and encompass all these data sources. Even if the organization were willing to fund such an in-house-only architecture initially, the relentless growth of new realtime data sources makes this an impractical strategy.

The same applies to businesses looking to optimize customer experience by analyzing and correlating live data from web clickstreams with realtime data from monitoring tools spread across the company’s various clouds, datacenters and networks. The onset of a network problem in one small area of the company’s global infrastructure may be detected and quickly corrected by looking not only at the network monitoring data but also at realtime changes in customer experience, as evidenced by a drop in completed sales detected in realtime web clickstreams. As more and more of the business becomes instrumented, the number of streams to be analyzed and correlated will grow correspondingly, driving the need for realtime intercloud computing.
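A minimal sketch of that kind of correlation, assuming Python: flag moments where the clickstream-derived sales-completion rate drops below a baseline within a short window of a network monitoring alert. The thresholds, stream contents, and 30-second correlation window are illustrative assumptions, not a production detection rule.

```python
# Correlate two live streams: sales-completion samples from the web
# clickstream and alerts from network monitoring. Illustrative only.

CORRELATION_WINDOW = 30   # seconds between a rate drop and an alert
BASELINE_COMPLETION = 0.60

def correlate(completion_samples, network_alerts):
    """Return (timestamp, region) pairs where a completion-rate drop and
    a network alert occur within the correlation window of each other."""
    incidents = []
    for ts, rate in completion_samples:
        if rate >= BASELINE_COMPLETION:
            continue  # completion rate looks healthy at this sample
        for alert_ts, region in network_alerts:
            if abs(ts - alert_ts) <= CORRELATION_WINDOW:
                incidents.append((ts, region))
    return incidents

# (timestamp_seconds, completion_rate) from the web clickstream.
completions = [(1000, 0.62), (1030, 0.41), (1060, 0.39)]
# (timestamp_seconds, region) from network monitoring.
alerts = [(1025, "eu-west"), (2000, "us-east")]

print(correlate(completions, alerts))  # -> [(1030, 'eu-west')]
```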

These are just two simple examples of where we are headed with the intercloud, as we look to move beyond the existing internet to a new model better suited to the challenges of delivering a realtime web for how we live and work today.

More Stories By Bill McColl

Bill McColl left Oxford University to found Cloudscale. At Oxford he was Professor of Computer Science, Head of the Parallel Computing Research Center, and Chairman of the Computer Science Faculty. Along with Les Valiant of Harvard, he developed the BSP approach to parallel programming. He has led research, product, and business teams, in a number of areas: massively parallel algorithms and architectures, parallel programming languages and tools, datacenter virtualization, realtime stream processing, big data analytics, and cloud computing. He lives in Palo Alto, CA.
