Big Data – A Sea Change of Capabilities in IT

An exclusive Q&A with Matt McLarty, Vice President, Client Solutions at Layer 7 Technologies

"Big data represents a sea change of capabilities in IT" notes Matt McLarty, Vice President, Client Solutions at Layer 7 Technologies, in this exclusive Q&A with Cloud Expo Conference Chair Jeremy Geelan. McLarty continued: "In conjunction with mobile and cloud, I think Big Data will provide a technological makeover to the typical enterprise infrastructure, drawing a hard API border in front of core business services while blurring the line between logic and data services."

Cloud Computing Journal: Agree or disagree? - "While the IT savings aspect is compelling, the strongest benefit of cloud computing is how it enhances business agility."

Matt McLarty: Agree. A number of our customers use Layer 7 Gateways to protect their cloud deployments and leverage the cloud's elastic scaling model to handle seasonal or sporadic bursts of traffic dynamically. Historically, these companies would have had to forecast that demand and risk over-buying infrastructure. So there is a big cost saving, but dynamic scaling is a new capability that comes only with the cloud model.
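
The contrast McLarty draws between forecast-based over-buying and dynamic scaling can be sketched in a few lines. This is an illustrative toy policy, not Layer 7's implementation; the capacity figures and function names are invented for the example.

```python
import math

def instances_needed(requests_per_sec: float,
                     capacity_per_instance: float = 100.0,
                     minimum: int = 2) -> int:
    """Size the fleet to observed load, never dropping below a floor."""
    return max(minimum, math.ceil(requests_per_sec / capacity_per_instance))

# A seasonal burst triples traffic; capacity follows the load instead of
# being provisioned for the peak all year round.
baseline = instances_needed(250)   # steady-state load -> 3 instances
burst    = instances_needed(900)   # holiday spike     -> 9 instances
```

With forecast-based buying, the fleet would sit at the burst size permanently; with a metric-driven policy like this, it expands and contracts with demand.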


[Photo: The Layer 7 booth at 10th Cloud Expo | Cloud Expo New York]

Cloud Computing Journal: Which of the recent big acquisitions within the Cloud and/or Big Data space have most grabbed your attention as a sign of things to come?

McLarty: What's grabbed my attention most is the fact that the Big Data - and specifically Hadoop - world is so raw that acquisition targets don't even exist. In their place, we've seen an unprecedented talent acquisition spree for anyone with Hadoop experience and data science skills. Big data represents a sea change of capabilities in IT and will have an impact on people, process and tools. In conjunction with mobile and cloud, I think Big Data will provide a technological makeover to the typical enterprise infrastructure, drawing a hard API border in front of core business services while blurring the line between logic and data services.

Cloud Computing Journal: In its recent "Sizing the Cloud" report Forrester Research said it expects the global cloud computing market to reach $241BN in 2020 compared to $40.7BN in 2010 - is that kind of rapid growth trajectory being reflected in your own company or in your view is the Forrester number a tad over-optimistic?

McLarty: Of course, this comes down to what people define as "cloud computing." Are traditional ASPs already being branded as cloud providers? Regardless, there are enough dimensions of growth for cloud - migration of COTS offerings to SaaS, globalization, support for mobile channels and big data - to justify an order of magnitude of growth in a decade. It is certainly reflected in the growth of Layer 7's business, and I'm sure there are more daring projections out there in the blogosphere.

Cloud Computing Journal: Which do you think is the most important cloud computing standard still to tackle?

McLarty: I think a standard/syntax for auto-provisioning cloud services would be quite useful. As I said earlier, much of the unique value of cloud comes from the ability to provision infrastructure dynamically. Having the ability to migrate or balance workloads across a hybrid or federated cloud would be powerful for companies, but it would undoubtedly be met by resistance from the cloud providers and from the niche companies that have built a business around such a service.
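
A hypothetical sketch of the kind of provider-neutral auto-provisioning syntax McLarty describes: a declarative spec that a hybrid or federated cloud could use to place a workload. Every field name here is invented for illustration; no actual standard is implied.

```python
import json

# An invented, provider-neutral provisioning spec for one service.
spec = json.loads("""
{
  "service": "order-api",
  "instances": {"min": 2, "max": 20},
  "scale_on": {"metric": "requests_per_sec", "target": 100},
  "placement": ["private-dc", "public-cloud"]
}
""")

def pick_placement(spec: dict, private_capacity_free: int) -> str:
    """Toy placement rule for a hybrid cloud: stay on-premises while
    capacity allows, burst to the public provider otherwise."""
    need = spec["instances"]["min"]
    if private_capacity_free >= need:
        return spec["placement"][0]
    return spec["placement"][1]
```

The point of such a syntax is exactly the portability McLarty predicts providers would resist: the same spec could be handed to any participating cloud.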

Cloud Computing Journal: Big Data has existed since the early days of computing; why, then, do you think there is such an industry buzz around it right now?

McLarty: Like many technological innovations, Big Data has to have a lot of things coming together to make it appetizing to the mainstream. I remember seeing Sony HDTVs around 1990, but it wasn't until around 2005 that there was a critical mass of content, network capability and parts commoditization to make it palatable for the masses. The same thing is happening with Big Data: we now have the network bandwidth, distributed computing power and caching technology to make unstructured, fragmented data retrieval practical. And most of all we have the burning platform; we have simply outgrown our relational indexing capabilities.

Cloud Computing Journal: Do you think Big Data will only ever be used for analytical purposes, or do you envisage that it will actually enable new products?

McLarty: I believe that Big Data has the potential to augment all existing IT interactions. I would answer a slightly different question: if analytics are now available in a real-time context, how can they be used to augment other business and IT services? In the world of real-time integration - the world Layer 7 thrives in - we have seen an industry build out around Event-Driven Architecture, and consequently seen that solution area integrate with SOA. Big Data can drastically change that game, and I envision a post-Big Data enterprise integration landscape where real-time business services are analytics-enriched, exposed through secure APIs, and accessible to mobile devices, web apps, and B2B consumers.
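
The "analytics-enriched" pattern McLarty envisions can be shown in miniature: a real-time business response is augmented with a precomputed analytic before it reaches mobile, web, or B2B consumers. The data store, field names, and lookup below are all stand-ins invented for the example.

```python
# Stand-in for a fast serving store fed by batch/stream analytics jobs.
recommendations = {"cust-42": ["sku-7", "sku-19"]}

def get_order_status(customer_id: str, order_state: str) -> dict:
    """Core business response, enriched with an analytics-derived field."""
    return {
        "customer": customer_id,
        "status": order_state,
        # Enrichment step: the real-time path only does a cheap lookup;
        # the heavy analytics ran offline and populated the store.
        "recommended": recommendations.get(customer_id, []),
    }
```

The design choice is that the API stays real-time because the expensive analytics work happens out of band, which is what makes analytics usable inside a request/response path at all.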

More Stories By Jeremy Geelan

Jeremy Geelan is Chairman & CEO of the 21st Century Internet Group, Inc. and an Executive Academy Member of the International Academy of Digital Arts & Sciences. Formerly he was President & COO at Cloud Expo, Inc. and Conference Chair of the worldwide Cloud Expo series. He appears regularly at conferences and trade shows, speaking to technology audiences across six continents. You can follow him on Twitter: @jg21.

