Cloud Computing: The Need For Speed

Who Is Going To Build The Low-Latency Cloud for Enterprise Customers?

The spectacular predictions for cloud revenue growth over the coming decade depend critically on the success of cloud providers in persuading major enterprise customers to deploy many of their mission-critical systems in the cloud rather than in-house. Today's cloud users are overwhelmingly web companies, for whom the low cost of cloud computing, together with simple elastic scaling, are the critical factors. But major enterprises are different. In looking at the cloud option, their first major concern is security. This is well documented, has been discussed to death in cloud blogs and at conferences, and is certainly a real problem. However, I think we can be very confident that cloud vendors will quickly overcome the concerns in this area. The reality, although not currently the perception, is that many, if not most, in-house systems are no more secure than cloud systems. The perception of cloud security will change quickly over the next couple of years as more cloud services are deployed. There are certainly no significant technical impediments here: many of the security technologies used to protect in-house datacenters can be adapted to secure cloud services.

A second major concern for enterprises is reliability, but here again we can be confident that perceptions will change rapidly as cloud vendors continue to improve the reliability of their datacenter and network architectures. Again, there are no major technical impediments here.

In comparison to security and reliability, the third major concern around enterprise cloud computing has received almost no attention so far. That concern is speed, or rather the lack of it in the cloud. In short, the cloud is slooooooooowww, and as multicore moves to manycore, and as the cost of super-fast SSD storage plummets, the difference in speed between cloud computing and in-house systems will widen dramatically unless there is a serious shift in the kinds of cloud services on offer from vendors to high-end users.
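
To make the point concrete, here is a rough Python probe (the hostname is just a placeholder, and any remote endpoint will do) comparing a purely in-memory operation with the network round trip that every cloud request has to pay for:

```python
# Back-of-envelope latency probe: compare in-memory work with a network
# round trip. The hostname is a placeholder; any remote endpoint will do.
import socket
import time

def tcp_connect_ms(host: str, port: int = 443, samples: int = 5) -> float:
    """Median TCP connect time in milliseconds, a floor on request latency."""
    times = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            pass
        times.append((time.perf_counter() - start) * 1000)
    return sorted(times)[len(times) // 2]

start = time.perf_counter()
_ = sum(range(1_000_000))  # a few milliseconds of pure in-memory work
local_ms = (time.perf_counter() - start) * 1000

print(f"in-memory work:      {local_ms:.2f} ms")
print(f"network round trip:  {tcp_connect_ms('example.com'):.2f} ms")
```

Even before a cloud service does any useful work, the wire adds milliseconds that an in-memory system simply never pays.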

Consider the most basic of all enterprise services, online transaction processing (OLTP). This is at the heart of all ecommerce and banking. Modern OLTP requires that web transactions and business transactions be processed in a fraction of a second, and that the OLTP system have the scalability and performance to handle even the heaviest loads at this very low latency. But it's even more challenging than that. OLTP systems need to be tightly coupled to complex realtime fraud detection systems that operate with the same very low latency. Can anyone see VISA or American Express moving to the cloud as we know it today?
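
To see what that coupling looks like, here is a toy sketch; the latency budget, threshold, and scoring heuristic are purely illustrative and bear no relation to how any real payment network works:

```python
# Toy sketch of OLTP coupled to realtime fraud detection. All names,
# thresholds, and budgets here are illustrative assumptions.
import time

LATENCY_BUDGET_MS = 250   # assumed end-to-end budget for one transaction
FRAUD_THRESHOLD = 0.9

def fraud_score(txn: dict, history: list) -> float:
    """In-memory heuristic: flag amounts far above the account's average."""
    if not history:
        return 0.0
    avg = sum(t["amount"] for t in history) / len(history)
    return min(1.0, txn["amount"] / (10 * avg))

def process_transaction(txn: dict, history: list) -> str:
    start = time.perf_counter()
    score = fraud_score(txn, history)  # must stay in-memory to stay in budget
    decision = "DECLINE" if score > FRAUD_THRESHOLD else "APPROVE"
    elapsed_ms = (time.perf_counter() - start) * 1000
    assert elapsed_ms < LATENCY_BUDGET_MS, "blew the latency budget"
    return decision

history = [{"amount": 40.0}, {"amount": 55.0}, {"amount": 62.0}]
print(process_transaction({"amount": 49.99}, history))    # APPROVE
print(process_transaction({"amount": 9000.00}, history))  # DECLINE
```

The hard part is not the scoring logic; it is keeping every step of it inside the budget while the system scales to peak load.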

Amazon Web Services is the main cloud provider today. At Cloudscale we are significant and enthusiastic users of numerous AWS cloud services. The Amazon team has built an architecture that is very low cost and ideally suited to supporting web companies whose transactions mainly involve serving up data rather than performing complex analysis. For this simple style of website data delivery, storage services such as S3 offer acceptable performance. But what about modern enterprise apps?
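
As a rough illustration of that access pattern, here is a minimal timing sketch using boto3; the bucket and key are placeholders, and it assumes AWS credentials are already configured in the environment:

```python
# Time a single S3 object fetch. Bucket and key are placeholders.
import time
import boto3

s3 = boto3.client("s3")

start = time.perf_counter()
response = s3.get_object(Bucket="my-example-bucket", Key="assets/page-data.json")
payload = response["Body"].read()
elapsed_ms = (time.perf_counter() - start) * 1000

# Fine for serving a web page; painful inside a tight analytics loop.
print(f"fetched {len(payload)} bytes in {elapsed_ms:.1f} ms")
```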

The next generation of enterprise apps will revolve around two major themes: big data and realtime. Even the most traditional of enterprise software vendors, SAP, recognizes this. Hasso Plattner, the company's founder, has been saying for the past couple of years that the future of enterprise software will be based on in-memory database processing that can deliver realtime insight on huge volumes of data.

At Cloudscale we have developed a new realtime data warehouse model that enables both realtime ETL and realtime analytics on big data. The model and its in-memory architecture are designed to allow complex realtime analysis, filtering and transformation of big data with latencies of at most a few seconds. This contrasts with traditional data warehouse architectures and parallel tools such as MapReduce/Hadoop, where latencies are usually measured in hours.
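
The details of our model are beyond the scope of this post, but the general technique can be sketched in a few lines of Python: streaming ETL feeding an in-memory tumbling window, so that aggregates are seconds old rather than hours old. Everything below is illustrative, not our actual implementation:

```python
# Sketch of streaming ETL plus in-memory windowed aggregation.
import time
from collections import defaultdict

WINDOW_SECONDS = 5  # illustrative window; results are at most seconds stale

def transform(event: dict) -> dict:
    """Realtime ETL step: clean and reshape one event as it arrives."""
    return {"key": event["user"].lower(), "value": float(event["spend"])}

def stream_aggregate(events, window_seconds=WINDOW_SECONDS):
    window_start = time.time()
    totals = defaultdict(float)
    for raw in events:
        row = transform(raw)  # ETL happens in-stream, not in a nightly batch
        totals[row["key"]] += row["value"]
        if time.time() - window_start >= window_seconds:
            yield dict(totals)  # a fresh aggregate every few seconds
            totals.clear()
            window_start = time.time()
    if totals:
        yield dict(totals)  # flush the final partial window

# Feed it any iterator of events: a socket, a queue, a Kafka consumer.
events = [{"user": "Alice", "spend": "9.50"}, {"user": "BOB", "spend": "3.25"}]
for snapshot in stream_aggregate(events):
    print(snapshot)  # {'alice': 9.5, 'bob': 3.25}
```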

In contrast to other companies, and in keeping with our company name, we have initially deployed the realtime data warehouse in the cloud, before deploying it on in-house systems. It is offered today as the Cloudcel service, currently hosted on AWS.

Later in the year we will be launching the Cloudscale R1 architecture for in-house deployment. For this R1 "supercruncher" we have an astonishing array of technology choices available at very aggressive price points: manycore chips, 10GigE or InfiniBand networking, massive memory sizes, and SSD storage solutions with phenomenal bandwidth (up to 1TB/sec at scale). But more important than all of that is that users will not have to go across the internet to get to it.
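
A quick back-of-envelope calculation shows why that last point dominates; the figures below are assumptions based on the numbers just quoted:

```python
# Scanning a 1 TB working set: local SSD arrays vs. pulling it over the wire.
DATA_BYTES = 1e12        # 1 TB working set

local_ssd_bw = 1e12      # 1 TB/sec aggregate SSD bandwidth, as quoted above
internet_bw = 1e9 / 8    # an assumed 1 Gbit/sec link = 125 MB/sec

print(f"local scan:        {DATA_BYTES / local_ssd_bw:.0f} s")        # ~1 second
print(f"over the internet: {DATA_BYTES / internet_bw / 3600:.1f} h")  # ~2.2 hours
```

One second versus more than two hours, before a single query has run: that is the gap today's cloud architectures have to close.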

This is the technology world we're moving into. Will cloud services be able to keep up, to stay in the race? It's clear that today's cloud services are not optimized for this kind of enterprise computing. Will new cloud offerings emerge? Who will deliver them? Will it be the networking vendors or the telcos? Clearly the central problem in all of this is latency. Who is going to build the low-latency cloud for enterprise customers? If it's you, then we would love to hear about it at Cloudscale. We may just have the killer app for you!

More Stories By Bill McColl

Bill McColl left Oxford University to found Cloudscale. At Oxford he was Professor of Computer Science, Head of the Parallel Computing Research Center, and Chairman of the Computer Science Faculty. Along with Les Valiant of Harvard, he developed the BSP approach to parallel programming. He has led research, product, and business teams, in a number of areas: massively parallel algorithms and architectures, parallel programming languages and tools, datacenter virtualization, realtime stream processing, big data analytics, and cloud computing. He lives in Palo Alto, CA.
