Cloud Computing: Knowledge Leads to a Change in Thinking

An exclusive Q&A with Chetan Patwardhan, CEO of Stratogent

"The basic premise for any central computing system optimized for mass consumption is the 80/20 rule. It can be built only to serve 80% of the needs in an economized and optimized fashion," noted Chetan Patwardhan, CEO of Stratogent, in this exclusive Q&A with Cloud Expo Conference Chair Jeremy Geelan. "Having said that," Patwardhan continued, "the so-called cloud economics work only for a certain type of system and is outright prohibitively expensive for most enterprise setups where a typical three-year timeframe cost view is more dependent on human labor than on the infrastructure."

Cloud Computing Journal: Just having the enterprise data is good. Extracting meaningful information out of this data is priceless. Agree or disagree?

Chetan Patwardhan: Agree 100%. Let's look at the value creation process: data is nothing but innumerable floating points of reference. Gathering data is only the first step. Creating useful information out of that data is a truly daunting task, not because of the complexity of the data, but because it is the simplicity of information that leads to the creation of knowledge. For the CEO of a large company, a dozen key information sets presented in up/down or chart format can create knowledge of how the company is performing. Knowledge leads to a change in thinking, sometimes creating paradigm shifts in how companies approach challenges. Changes in thinking bring about decision making and changes in the behavior of the organization. This chain reaction finally leads to success.

One key here lies in the ability of the information system to let consumers of data effortlessly traverse from data sets to concise, simple information and vice versa. For example, if a simple graph shows the market share of a product worldwide, that's great information. There should then be the ability to click on that graph and keep drilling down to continents, regions, countries, states, cities, and finally individual stores. In other words, neither the data by itself nor the one-way extraction of information is an answer in itself without this ability to traverse back and forth, pivot, report, represent, and share with point-and-click ease.
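To make that drill-down idea concrete, here is a minimal sketch of a reporting layer that aggregates a single dataset at successively finer geographic levels, so a user can pivot from the worldwide view down to individual stores. The dataset, column names, and the drill_down helper are hypothetical illustrations, not anything from Stratogent:

```python
# Minimal drill-down sketch: one dataset, one hierarchy, aggregated at
# successively finer levels. All data and column names are hypothetical.
import pandas as pd

sales = pd.DataFrame(
    [
        ("North America", "USA",     "CA",      "San Jose",  "Store 12", 430),
        ("North America", "USA",     "CA",      "San Diego", "Store 7",  210),
        ("Europe",        "Germany", "Bavaria", "Munich",    "Store 3",  310),
    ],
    columns=["continent", "country", "state", "city", "store", "units"],
)

# The drill-down path from the worldwide chart to individual stores.
levels = ["continent", "country", "state", "city", "store"]

def drill_down(df, depth):
    """Aggregate units sold at the first `depth` levels of the hierarchy."""
    return df.groupby(levels[:depth])["units"].sum()

print(drill_down(sales, 1))  # top-level view: the worldwide market-share graph
print(drill_down(sales, 5))  # fully drilled down to individual stores
```

The same aggregation logic serves every level of the hierarchy, which is what makes the back-and-forth traversal Patwardhan describes feel like a single point-and-click experience rather than separate reports.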

Finally, let's revise the chain reaction: collection of good data leads to meaningful information. Information leads to knowledge, which in turn leads to changes in behavior and critical decision making. More than ever before, not just the success but the very survival of enterprises will be dictated by their ability to collect data and convert it into precisely timed, good decision making!

Cloud Computing Journal: Forrester's James Staten: "Not everything will move to the cloud as there are many business processes, data sets and workflows that require specific hardware or proprietary solutions that can't take advantage of cloud economics. For this reason we'll likely still have mainframes 20 years from now." Agree or disagree?

Patwardhan: Well, define mainframe and cloud. I thought they were synonymous :). Before the concept of the cloud, the mainframe was the cloud. Back in the day, we connected to that cloud via so-called dumb terminals. For those old enough to have used IBM PROFS messaging, it was the first instant email and instant messenger system in one. And it worked really well! The limitations of the cloud today are the same as those of the mainframe then.

The basic premise for any central computing system optimized for mass consumption is the 80/20 rule. It can be built only to serve 80% of the needs in an economized and optimized fashion. Having said that, the so-called cloud economics work only for certain types of systems and are prohibitively expensive for most enterprise setups, where a typical three-year cost view depends more on human labor than on the infrastructure.

Now, to the point: can the cloud replace, say, 99% of all conventional computing? Certainly not any time in the near future. There are several reasons for this. First, let's admit that most applications were never designed, and as a matter of fact can't be designed from scratch, to run on the cloud. Why? Because, fundamentally, there is no standardized, here-to-stay cloud infrastructure that enterprise applications can be written to. Second, as someone who has installed and managed systems for enterprises from startups to Fortune 500 companies, I can tell you that no two sets of information systems look alike, let alone 80% of them. Third, many enterprises need a level of security, customization, and back-end connections (XML, EDI, VPN) that can't exist in the cloud without the cloud looking the same as the conventional system. Fourth, there is little transparency and accountability in the cloud when it comes to the ability to audit and maintain compliance levels. And last but not least, if the cloud ultimately imposes almost the same burden on human engineers (minus keeping up the physical servers), where are the economies to be had?

From my perspective, the way to go will be consolidation of human resource talent pools, combined with the ability to leverage the most economical cloud options (IaaS, PaaS, and SaaS) as well as conventional datacenter setups - essentially a super-hybrid approach.

Cloud Computing Journal: The price of cloud computing will go up - so will the demand. Agree or disagree or....?

Patwardhan: I don't understand why the price of cloud computing would go up. I expect it to remain flat over the next few years. While efficiencies in hardware will reduce prices and/or increase processing capabilities, the overhead of maintaining availability and the pressure to provide a human interface will also increase. As a result, prices will probably remain flat. As for demand, it will increase, but one must first factor in the overall demand for computing, which is constantly on the rise. Since the cloud is one way to satiate that demand, cloud subscriptions will rise too. Of the three types of cloud, PaaS and SaaS should, to me, generate more demand than pure IaaS, because they both address the cost of human labor. As long as the PaaS and SaaS providers get it right in terms of addressing user needs, demand for those services should rise.

Cloud Computing Journal: Rackspace is reporting 80% growth from cloud computing, Amazon continues to innovate and make great strides, and Microsoft, Dell, and other big players are positioning themselves as leaders. Are you expecting in the next 18 months to see the bottom fall out and scores of cloud providers failing or getting gobbled up by bigger players? Or what?

Patwardhan: The news of Rackspace reporting 80% growth in cloud computing needs a special lens for viewing! Rackspace's model from day one has been to lease hosted servers, networking gear, and storage. With their cloud solution, they are, hmmm, leasing hosted servers, networking gear and storage! Essentially, the only change in their offering (pre-cloud and cloud) is flexibility and elasticity. It's important to take into account the trajectory of their overall demand, then contrast that against how much of that demand was served by their conventional model versus their cloud model. The cloud model for Rackspace customers is nothing but a little cheaper way to get started.

As for the small and large companies setting up infrastructure farms, adding their differentiated layers of service, it's a phenomenon not unique to the cloud. It happens every time a new bandwagon arrives. For now, it seems that a variety of local, regional and national players are thriving. Again, this is a common phenomenon in any cycle.

What does the landscape look like five years from now? There are three big factors unique to cloud providers. First, it takes real infrastructure (datacenter, equipment) to create a cloud service. Second, infrastructure ages and becomes obsolete quickly in this industry. Third, smaller companies, once past their first or second installation, will struggle: they will either stagnate and die from attrition (easy in the cloud) or die from cash-flow challenges if they do find a way to grow.

Wait a minute, does that sound familiar? It's not that unique after all, is it? All infrastructure companies, from telecom to trucking, suffer the same fate. Ergo, some will die, and for some customers their cloud will disappear with little notice. Yet others will find bigger fish that will gobble them up at low prices. It's unlikely for an IaaS cloud provider to be bought by a big player for a handsome price unless it has great momentum, brand, and profitability. I don't expect dramatic events in the next 18 months, but I do expect the law of the jungle to prevail over the next five years.

Cloud Computing Journal: Please name one thing that - despite what we all may have heard or read - you are certain is not going to happen in the future with cloud and Big Data? ;-)

Patwardhan: I find this question amusing because it tempts me to put things in here like "a telco or a bank will never use the cloud to provide their core telecom or banking service." I would have preferred to answer a question that mentioned a few things that are possible candidates for the cloud but, in my opinion, will not happen. Let me leave this thought behind: if there are things that you think will never happen in the cloud, think again. It is only a matter of time before the evolution of secure, virtualized, and orchestrated platforms, the ingenuity of service providers, and the shortage of qualified human engineers move things to the cloud in ways we are not willing to think about today.

More Stories By Jeremy Geelan

Jeremy Geelan is Chairman & CEO of the 21st Century Internet Group, Inc. and an Executive Academy Member of the International Academy of Digital Arts & Sciences. Formerly he was President & COO at Cloud Expo, Inc. and Conference Chair of the worldwide Cloud Expo series. He appears regularly at conferences and trade shows, speaking to technology audiences across six continents. You can follow him on Twitter: @jg21.
