June 6, 2010 08:15 AM EDT
The cloud essentially "consumerizes" all of IT, not just relatively unimportant bits like the procurement of personal hardware and software. This requires a complete rethinking of corporate IT, as the idea of any master design becomes unattainable. How can IT as a species survive a trend that may render the education of a whole generation of IT professionals irrelevant? On the brighter side, it caters perfectly to the talents of today's teenagers: consumption as a lifestyle.
The idea of consumerization - users being allowed to freely procure their own personal hardware and software - has been around for a while. But few CIOs, and even fewer heads of IT Operations, have embraced it. Other than some token adoption - users choosing between an iPhone or a BlackBerry, or users getting a personal budget to order from a company-supplied catalog of pre-approved hardware - we see little uptake of the concept. The real idea is that users can go to any consumer store or webshop and order any gadget they like, be it an iPad, a laptop, a printer or a smartphone, and configure it, basically while still in the store, to access their corporate mail, intranet and company applications. The idea originated when people wanted to use their 24-inch HD PC with four processors and loads of memory - essential to enjoy modern home entertainment and video, and far superior to company standard-issue equipment - to also do some work.
Cloud computing now makes such a consumer approach possible at the departmental level as well. Departments selecting and using SaaS-based CRM applications that corporate IT never approved or endorsed are the most commonly cited example. But more interesting are the cases where departments - tired of waiting for their turn in the never-shrinking application backlog of corporate IT - turned to a system integrator to build a custom cloud application to meet their immediate needs. Several system integrators indicate that in more and more of their projects the business department, not IT, is their prime customer. Contracts, SLAs and even integrations are negotiated directly between the SI and the business department; in some cases IT is not even involved or aware.
So back to consumerization. Although the trend has been far from wholeheartedly embraced by most corporate IT departments, it is continuing. In my direct environment I see several people who, instead of plugging their laptop into the corporate network at the office, take a 3G network stick to work. For around 20 Euros a month this gives them better performance accessing the applications they care about, not to mention access to applications most corporate IT departments do not care for, like Facebook and Twitter. The question is, of course: can they do their work like that? Don't they need all-day, full-time access to the aforementioned fully vertically integrated ERP system? The answer is no. First of all, the vertically integrated type of enterprise that ERP was intended for no longer exists. Most corporations have taken to outsourcing distribution to DHL or TNT, employee travel to the likes of American Express, HR payroll and expenses to XYZ, and so on. The list goes on and on.
All these external service providers support these services with web-based systems that can be accessed from anywhere, inside or outside the company firewall. At the same time, the remaining processes that occur in the corporate ERP system are so integrated that they hardly require any manual intervention from employees. Consequently, employees don't need to spend their time doing data entry, or even data updates or analysis, on that system. Any remaining required interaction is handled by interfacing directly with the customer via the webshop or other web-based systems. One could say that the world moved from vertically integrated manufacturing corporations to supply-chain-connected extended enterprises.
The question I will address next is how the cloud, by enabling consumerization of enterprise applications, plays a role in this, and what it means for IT moving forward.
On the supply side of IT, it means applications are best delivered as easily consumable services to employees and others (partners, customers, suppliers). One large European multinational is already delivering all its new applications as internet (so not intranet) applications, meaning any application can be accessed from anywhere by simply entering a URL and passing proper authentication. Choosing which applications to provide internally is based on whether there are outside parties willing and able to provide these services, or whether the company can gain a distinct advantage by providing the service itself.
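A minimal sketch of what such a URL-plus-authentication model could look like: every application sits behind one authentication check instead of behind the network perimeter. The app names and the toy token store are invented for illustration, standing in for a real identity system.

```python
# Toy model: applications exposed at URLs behind a single auth check.
# VALID_TOKENS stands in for a real identity provider (hypothetical).
VALID_TOKENS = {"secret-token-123": "alice"}

# Hypothetical application registry: path -> handler.
APPS = {
    "/expenses": lambda user: f"expense report for {user}",
    "/crm": lambda user: f"CRM dashboard for {user}",
}

def handle_request(path, token):
    """Authenticate first, then route to the requested application."""
    user = VALID_TOKENS.get(token)
    if user is None:
        return 401, "authentication required"
    app = APPS.get(path)
    if app is None:
        return 404, "no such application"
    return 200, app(user)

print(handle_request("/crm", "secret-token-123"))
# → (200, 'CRM dashboard for alice')
```

The point of the sketch is that location (office network or 3G stick) plays no role in the decision; only identity does.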
When speaking about consuming services, one should try to think more broadly than just IT services. The head of distribution may be looking for a parcel tracking system, but the CEO or COO is more likely to think of a service as something a DHL or TNT delivers: warehousing and distribution, but also complaint tracking, returns and repairs, or even accounting, marketing and reselling, all including the associated IT parts of those services. It is the idea of everything as a service, but on steroids (XaaSoS). Please note that even when an organization decides to provide one of these services internally, it can still source the underlying infrastructure and even applications "as a service" externally (strangely enough, this last scenario is what many an IT person seems to think of exclusively when discussing cloud computing).
On the demand side of IT, the issue is an altogether different one: how do we ensure continuity, efficiency and compliance in such a consumption-oriented IT world? If it is every man (or department) for themselves, how do we prevent suboptimization? In fact, how do we even know what is going on in the first place? How do we know which services are being consumed? This is the new challenge, and it is very similar to what companies faced when they decided to stop manufacturing everything themselves, abandoning vertical integration where it made sense and taking a "supply chain" approach. Cloud computing is in many respects a similar movement, and here too a supply chain approach looks like the way to go.
Such a supply chain approach means thoroughly understanding both demand and supply, matching the two, and making sure that the goods - or in this case services - reach the right audience at the right time (on demand). IT has invested a fair amount of time and effort in better ways and methodologies to understand demand. On the supply side, IT has until now assumed it was the supplier. In that role it used industry analysts to classify the required components, such as hardware and software. In this new world it needs to start thoroughly understanding the full services that are available on the market. An interesting effort worth mentioning here is the SMI (Service Measurement Index), an approach to classifying cloud services co-initiated by my employer, CA Technologies, and led by Carnegie Mellon University.
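As a toy illustration of how an SMI-style classification might be used to compare services, here is a weighted-scoring sketch. The category names, weights and ratings are invented for the example and are not the official SMI definition.

```python
# Illustrative weighted scoring of cloud services against SMI-style
# categories. All names and numbers below are hypothetical.

def score_service(ratings, weights):
    """Weighted average of per-category ratings on a 0-10 scale."""
    total_weight = sum(weights.values())
    return sum(ratings[cat] * w for cat, w in weights.items()) / total_weight

# Hypothetical priorities: this buyer weights performance and security highest.
weights = {"performance": 3, "security": 3, "cost": 2, "agility": 1, "usability": 1}

# Hypothetical candidate services with per-category ratings.
candidates = {
    "ProviderA": {"performance": 8, "security": 6, "cost": 9, "agility": 7, "usability": 8},
    "ProviderB": {"performance": 9, "security": 9, "cost": 5, "agility": 6, "usability": 7},
}

ranked = sorted(candidates,
                key=lambda name: score_service(candidates[name], weights),
                reverse=True)
print(ranked)
# → ['ProviderB', 'ProviderA']
```

Changing the weights - say, a department that cares mostly about cost - reorders the ranking, which is exactly the kind of demand/supply matching the supply chain approach calls for.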
After having gained an understanding of both demand and supply, the remaining task is "connecting the dots". This sounds trivial, but it is an activity that analysts estimate will become a multi-billion-dollar industry within just a few years. It includes non-trivial tasks like identifying which users are allowed to perform which tasks in this now open environment, and optimizing processes by picking the resources with the lowest utilization and thus the lowest cost, because going forward scarcity will determine price, especially in the new cloud world (which resembles Adam Smith's idea of a perfect open market far more closely than any internal IT department ever did or will).
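A hypothetical sketch of that "connecting the dots" step: a broker picking the cheapest adequate resource, where scarcity inflates the effective price as utilization approaches 100%. Provider names, prices and utilization figures are made up for illustration.

```python
# Hypothetical broker: choose the provider with enough free capacity
# and the lowest scarcity-adjusted price. All data below is invented.
providers = [
    {"name": "cloud-a", "price_per_hour": 0.12, "utilization": 0.85, "capacity_free": 20},
    {"name": "cloud-b", "price_per_hour": 0.10, "utilization": 0.40, "capacity_free": 200},
    {"name": "cloud-c", "price_per_hour": 0.09, "utilization": 0.97, "capacity_free": 2},
]

def pick_resource(providers, units_needed):
    """Among providers with enough free capacity, pick the lowest
    effective price. Scarcity is modeled by dividing the list price
    by the remaining headroom (1 - utilization)."""
    eligible = [p for p in providers if p["capacity_free"] >= units_needed]
    return min(eligible,
               key=lambda p: p["price_per_hour"] / (1.0 - p["utilization"]))

print(pick_resource(providers, 10)["name"])
# → cloud-b
```

Note that cloud-c is nominally the cheapest, but its near-total utilization (scarcity) makes it the worst effective buy, which is the market dynamic the paragraph above describes.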
Now, of course, all of the above won't happen overnight. Many a reader (and with a little luck the author) will have retired by the time today's vertically integrated systems - many of which are several decades old and based on solid, reliable mainframes - have become services brokered in an open cloud market. A couple of high-profile outages may even delay this by a generation or two more. But long term I see no other way. Other markets (electricity, electronics, publishing and even healthcare) have taken or are taking the same path. It is the era of consumption.
PS Short term, however, the thing we (IT) probably need most is a new diagramming technique. Why? From the above it will be clear that - in such a consumerized world - architecture diagrams are a thing of the past. And an IT person without a diagram is like a fish without water. We need something that allows us to evolve our IT fins into feet and our IT gills into lungs, so we can transition from water to land and not become extinct in the process. One essential aspect will be that, unlike pictures of clouds and very much like real clouds, the diagrams will need to change dynamically, much like the pictures in a Harry Potter movie (it's magic). Who has a suggestion for such a technique?
[Editorial note: This blog was originally published at ITSMportal.com on May 31, 2010]