By Gathering Clouds
April 17, 2013 10:04 AM EDT
We recently had the pleasure of having an extended conversation with Dana Gardner, president and principal analyst at Interarbor Solutions. Our discussion covered enterprise cloud adoption, C-suite IT strategies, the changing cloud market, and much more. This is the second of two parts; check out Part 1.
Gathering Clouds: It seems to be the trend overall that cloud is the way to go. But there are significant hurdles for organizations to overcome.
Dana Gardner: I think the biggest category for that is just complexity. Complexity in terms of how do you deal with so many points of change in order to access the benefits of cloud? How do you transition without dropping the ball? How do you keep all of your business services available when you go from an on-premises IT strategy to a cloud or a hybrid model? How do you switch off one system and on another over a weekend without disrupting services and losing money? The complexity of adoption, I think, is the biggest hurdle; people resist it because it’s an unknown, it’s risky. And it requires a lot of organizational fortitude and capability.
You have to really be a very good company on both the business and technology sides, as well as being able to marshal people to change and adopt systems and different processes to conduct business and procure applications in different ways. So I guess change management and complexity are probably the two biggest issues that will maybe hold people back from adopting cloud.
But I do think that the vision after the transition should be the guiding principle. Even though it’s going to be a tough, messy, risk-intensive kind of a transition, when you can move towards a better IT paradigm, better models, when you can focus on just the applications that are core, when you can outsource commodity technology and make all the data and the applications available to the most people for the least amount of hassle, all of those are really important opportunities.
Companies begin thinking about this notion of a fabric approach to infrastructure. Then, when they see that that’s a good thing and they’ve made some progress, they can start to think about which applications are best suited for which models, and whether an outsourced approach (colocation or a managed service provider) or a public cloud is best. Companies will start to develop the skills and the capabilities, extending their level of immersion in cloud a little bit further each step. And before you know it, they’re into full-fledged hybrid cloud computing.
GC: What is the opportunity in this very dynamic market for managed service providers (MSPs) vs. big mega clouds vs. “one off” systems providers? Is it really going to boil down to cost, or are there other reasons that a large company would want to go the MSP route versus the DIY Amazon route versus some virtualization technology from VMware?
DG: Yeah, I think we’re going to see a really rich ecosystem of cloud providers, or managed service providers, that start to come to the rescue when it comes to these issues about complexity and relevance to a particular industry or geography or country or regulatory climate. And just as we saw with mainframes, client-server, distributed computing, and services-oriented architecture, MSPs are going to become integrators — middlemen organizations that fill a very large and necessary role.
If you think about a food chain, Amazon is providing the plankton so to speak — a base set of capabilities that others consume. But then they add value to that and provide services to yet another layer. You can really see already that they are building an ecosystem of providers. And just because it’s a public cloud with baseline capabilities that engineers are more comfortable with than business people for sure, I think we’re going to see more and more specialization based on value-added services, complexity, security, relevance to geography, compliance, for specific industry verticals.
And some providers are going to focus on making cloud providers invisible to the application providers, software-as-a-service (SaaS) providers, for example. And then there are going to be other providers that will deal with managing and integrating the data, and there’ll be other providers that will be there for extending cloud services to end-customers. So I doubt that any provider is going to be a one-stop shop; providers will be increasingly specialized.
GC: Where do you think MSPs are going to have the most impact?
DG: MSPs should focus on certain application sets, and they should look at helping their existing customers transition to this new vision of IT that we’ve been talking about. So stick to your existing installed base of clients, listen to them, and understand their pain. They should also move towards a value-added supply chain concept, where providers will be able to cut costs by leveraging cloud services from third parties and then pass along some of those savings to clients. Meanwhile, these providers will be adding value and extending and innovating around these services to solve more problems around the borderless enterprise, for example, or extended supply chains or new ways to market in the B2C play, taking advantage of more data and analytics in real time.
The key for providers is to achieve specialization around business services, stick close to the needs of your existing clients, and then cut your own costs by being a good exploiter of cloud values yourself. Another key point is to know your business as a provider. You might not be the best at building data centers, but you might be great at exploiting them once they’re built. So it may make sense to get out of the facilities business and into the technology value-add-as-a-service that you then extend into your particular markets. It also makes sense to start thinking about new markets that would be good fits. It could be saying, “Well, if we can do this in North America, we can do it in South America,” or “If we can understand the regulatory compliance issues in the healthcare industry, we can probably extend into life sciences or maybe even education.” Those are the kinds of strategic business decisions that MSPs should be thinking about. But I think that there’s an awful lot of opportunity out there.
GC: Looking out at the space, what do you think is missing in terms of the overall products being established, businesses being run, models being used?
DG: Well, we need to see more examples of how this works. There needs to be less custom, “one off” innovation and more of a standard operating procedure approach to how this could be done. To some of your questions earlier about how to get started: it’s still complex, it’s still mysterious. Heck, some companies don’t even know how to define cloud computing very well yet.
So somebody can do well by coming up with the methodologies, the standard operating procedures for entering new models of IT and services procurement, without it being unique each time, something that’s repeatable and understood. So we need maturity; I guess that’s the best word to describe what I’m getting at. We need to see more maturity, and examples of success that are understood and verified. That’s going to help the market advance more rapidly, and help business become regular, growing, and dependable rather than a difficult sell each and every time, moving from custom development to more of a standard procurement type of affair.
GC: And then, sort of extending this question a little bit, when is it not correct to view the cloud as the go-to solution?
DG: Well, just run a standard cost-benefit analysis. If you’re not getting any more value or innovation or services improvement for any less money, then why would you do it? The accountants and the chief operating officers are good at that; they’ll know pretty quickly whether something is in their best interest or not. There has to be a certain leap of faith, I suppose, but the other nice thing about cloud is that you can do it in a crawl, walk, run approach. You can bite off one or two applications, or take one or two parts of a data center and start to hybridize where you source it. It doesn’t have to be all or nothing.
And then you can start to look at what the cost benefits and the risks truly are, and make a rational, lucid business decision about how much further into cloud to go. It’s not irreversible. There’s no reason why, if you put apps and data in a cloud, you can’t say, “Listen, that didn’t work out. Let’s bring it back into our own data center.” You should have the ability to do that, and you’d certainly want to write it into your SLAs: you own the data, you own the apps, it’s your logic. Ultimately, it shouldn’t take too long to decide whether it’s a good move or not.
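That crawl-walk-run cost-benefit check can be sketched as a toy calculation. All of the figures and function names below are hypothetical illustrations, not benchmarks or vendor pricing:

```python
# Toy crawl-walk-run cost comparison; every number here is invented
# purely for illustration.

def annual_cost_on_prem(servers, cost_per_server, ops_overhead):
    """Fixed capacity: you pay for peak whether you use it or not."""
    return servers * cost_per_server + ops_overhead

def annual_cost_cloud(avg_instances, hourly_rate, hours_per_year=8760):
    """Pay-as-you-go: cost tracks average utilization, not peak."""
    return avg_instances * hourly_rate * hours_per_year

on_prem = annual_cost_on_prem(servers=20, cost_per_server=4000, ops_overhead=60000)
cloud = annual_cost_cloud(avg_instances=8, hourly_rate=0.50)

print(f"on-prem: ${on_prem:,.0f}/yr, cloud: ${cloud:,.0f}/yr")
# Bite off one workload at a time: migrate only where the cloud number wins.
if cloud < on_prem:
    print("cost-benefit favors migrating this workload")
```

The point of the sketch is Dana’s: the comparison is per workload, not all or nothing, and it is reversible if the numbers turn out wrong.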
GC: Looking forward, what do you foresee as some of the major changes coming down in the next few years that are really going to impact the way this space operates and works?
DG: Well, I think the whole notion of data-driven business is not something to underestimate. More and more companies are getting the right information they need to make scientific decisions rather than guesstimates or gut calls. And we’re able to derive more data from more types of activities, devices, and external events, whether it’s location and geography or usage patterns. More and more data sets are going to be available, perhaps at a price, but nonetheless available for you to analyze and really know what to do in business.
And so that fundamentally changes things. When there’s less of a guesstimate, you don’t have to be a car company in the 1960s and say, “We’re going to spend three billion dollars tooling up a factory to build a car called the Rambler.” Or even 10 years earlier, the Edsel. “And, you know, we’re basically gonna make a four billion-dollar bet that this thing’s gonna sell.” Those days are over. You’re not going to have to make that kind of guess, or you’re not going to hit as many failures, because you’re going to know what the market’s demanding, how to get into the market, how to adapt in the market, how to do the support, how to make your customers happy, how to reach them on their mobile devices at a location. It really changes business when it’s data-driven.
And so IT becomes, therefore, much more important to the integrity of the business. And then IT can make the decisions about how to provide those services best. Looking to the future, I see a lot more of the mobile craze continuing, where people are always on, always providing data, always getting data. Consumers will make better decisions. Businesses will make better decisions. A higher level of efficiency. Less guesswork, rapid improvement cycles in terms of adoption and refinement. So, yeah, it’s a very good time for business to be efficient and to find markets and innovate effectively. And a really great time for IT to enable them to do that.
GC: A lot of the time when we’re talking about cloud, there is this interesting overlap that occurs between cloud and big data. In your view, why are the two spoken of so closely now?
DG: Well, it’s kind of just the basic logistics of the data. Before, when you had a traditional data environment, you would be generating data from applications. You would cleanse and warehouse that data. You would deliver it in a batch format to some kind of analytics application to create reports. This could take months, perhaps being completed once a quarter. And it was difficult, it was expensive, there were large data sets to manage. But probably the biggest difference is that the data was sitting off somewhere on its own.
When you move to the cloud, your cloud provider is the data source for many of the players in an ecosystem, in a business vertical, in a supply chain. And if a company gives permission for that data to be shared or joined or analyzed in some way, then that data will be sitting in the same cloud, making it more easily leveraged or exploited than if each company had its own warehouse with different technologies and different analytic formats or analytic applications.
So, to me, combining the cloud with big data and analytics really juices up the opportunity for more data to be joined in a common fabric for the betterment of understanding and deriving value. Working with partners and combining data in ways that hadn’t been done before, through networking, software-defined data centers, and common data structures like NoSQL, NewSQL, Hadoop, and MapReduce, means that finding commonality and value between data sets, and then exploiting that in near real time, is more real than ever.
Look at a company like Apple’s supply chain: if a customer orders an iPad, it can go from China to a customer in Kansas City. These kinds of efficiencies are driven by big data with very tight loops of analysis and feedback. Companies then reapply that analysis for constant iterative improvement on a massive scale at a managed cost. In my view, that is the right equation, and we’re beginning to see more companies take that approach. Also, that benefit and those tools are no longer only available to companies like Southwest and Apple. Anybody that’s savvy enough to acquire business services and data and analytic services can create value. The whole notion of analytics and data-as-a-service becomes much more prominent and possible, and therefore more and more people are able to get more data about more activities and then apply it productively. It’s actually quite an amazing concept.
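Joining partner data sets in a common fabric, as Dana describes, can be illustrated with a minimal map/reduce-style aggregation. This is a toy sketch, not actual Hadoop code, and the products and quantities are invented for illustration:

```python
# Two partners' order feeds living in the same "fabric" (here, plain
# Python lists); all records are made up for this example.
from collections import defaultdict

retailer_orders = [("tablet", 3), ("phone", 5), ("tablet", 2)]
distributor_orders = [("phone", 7), ("laptop", 1)]

def map_phase(records):
    # Emit (key, value) pairs, the way a Hadoop mapper would.
    return [(product, qty) for product, qty in records]

def reduce_phase(pairs):
    # Sum values per key, the way a Hadoop reducer would.
    totals = defaultdict(int)
    for product, qty in pairs:
        totals[product] += qty
    return dict(totals)

# Because both data sets sit in the same cloud, joining them is one pass
# rather than an export/import between two separate warehouses.
combined = reduce_phase(map_phase(retailer_orders) + map_phase(distributor_orders))
print(combined)  # {'tablet': 5, 'phone': 12, 'laptop': 1}
```

The design point is the one from the interview: when partners share a fabric, the expensive step (moving data between incompatible warehouses) disappears, and only the analytics remain.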
GC: Who are some of the thought leaders you’re following in this space?
DG: Not too long ago, I used to have certain marquee publications that I would go to. And there would be certain writers that I would look to and say, “I know them, I’ve met them, I trust them, I read their stuff.” And I still do that to an extent, but more and more, the way in which social and aggregator sites and this feedback loop that we’ve been talking about in data and analytics is being applied, the stories kind of find me more than I find them now.
So when you use something like Flipboard, or you look at even Google News and Yahoo! News, if you’re signed in, they’re looking at what you look at, and they know more about you, based on your profile and your metadata and your activities online, than your social graph does. Suddenly, the algorithms and the content aggregators are better able to point content to me rather than my actually having to go out and pull it or discover it myself.
So more and more, I’m finding the right information based on letting the cloud services know what I’m looking for, in essence a profile of my needs. I’ll see stories from GigaOM and Barb Darrow, and I’ll read Dave Linthicum’s blog on InfoWorld or Charlie Babcock in InformationWeek.
I like this notion of the best and the brightest coming to me on a “one off” basis based on my needs and profile. I think it’s really powerful. When I go and do a search on a news topic, a lot of times what the algorithm delivers to me is pretty good!
And so more and more, I don’t have to just go to a marquee site and read the list of headlines there; the list of headlines follows me based on what I need to digest based on my activities online. And again, it’s a constantly iterative process, so in a way it’s a parallel in a microcosm to what cloud computing can be for businesses.
GC: Do you think that this emphasis on social has, in a way, impacted how we understand the development of the cloud space?
DG: Yeah. It’s a little bit uncharted territory, and it can be even a little scary at times, but I think if you’re authentic with what you do as a social entity online, and you really do go after what’s of interest to you, you’ll be rewarded by getting more back in terms of what is most relevant to you.
I eat, sleep, and drink cloud computing, and all the metadata that’s being carved out around me online is rewarding me by funneling the best and brightest information about cloud computing my way. This makes me a better-educated person. It’s an ongoing feedback loop that’s still kind of new, but so far, I think the results are actually quite good.
By Jake Gardner