June 6, 2010 08:15 AM EDT
The cloud essentially "consumerizes" all of IT, not just relatively unimportant bits like procuring personal hardware and software. This requires a complete rethinking of corporate IT, as the idea of any master design becomes unattainable. How can IT as a species survive a trend that may render the education of a whole generation of IT professionals irrelevant? On the brighter side, it caters perfectly to the talents of today's teenagers: consumption as a lifestyle.
The idea of consumerization - users being allowed to freely procure their own personal hardware and software - has been around for a while. But few CIOs, and even fewer heads of IT Operations, have embraced it. Beyond token adoption - letting users choose between an iPhone and a BlackBerry, or giving them a personal budget to spend on a company-supplied catalog of pre-approved hardware - we see little uptake of the concept. The real idea is that users can walk into any consumer store or webshop, order any gadget they like - an iPad, a laptop, a printer, a smartphone - and configure it, essentially while still in the store, to access their corporate mail, intranet and company applications. The idea originated when people wanted to use their 24-inch HD PC with four processors and ample memory - essential for modern home entertainment and video, and far superior to standard-issue company equipment - to also do some work.
Cloud computing now makes such a consumer approach possible at the departmental level as well. Departments selecting and using SaaS-based CRM applications that corporate IT never approved or endorsed are the most commonly cited example. But more interesting are the cases where departments - tired of waiting for their turn in the never-shrinking application backlog of corporate IT - turned to a system integrator to build a custom cloud application for their immediate needs. Several system integrators report a growing number of projects where the business department, not IT, is their prime customer. Contracts, SLAs and even integrations are negotiated directly between the SI and the business department; in some cases IT is not even involved, or even aware.
So back to consumerisation. Although most corporate IT departments have been far from wholehearted in embracing the trend, it is continuing. In my direct environment I see several people who, instead of plugging their laptop into the corporate network at the office, bring a 3G network stick to work. For around 20 euros a month this gives them better performance accessing the applications they care about, not to mention access to applications most corporate IT departments do not care for, like Facebook, Twitter, etc. The question is, of course: can they do their work like that? Don't they need all-day, full-time access to the aforementioned fully vertically integrated ERP system? The answer is no. First of all, the vertically integrated type of enterprise that ERP was intended for no longer exists. Most corporations have outsourced distribution to DHL or TNT, employee travel to the likes of American Express, HR payroll and expenses to XYZ, etc. The list goes on and on.
All these external service providers support their services with web-based systems that can be accessed from anywhere, inside and outside the company firewall. At the same time, the remaining processes that run in the corporate ERP system are so integrated that they hardly require any manual intervention from employees. Consequently, employees don't need to spend their time doing data entry, or even data updates or analysis, on that system. Any remaining required interaction is handled by interfacing directly with the customer via the webshop or other web-based systems. One could say the world has moved from vertically integrated manufacturing corporations to supply-chain-connected extended enterprises.
The question I will address in my next post is how the cloud, by enabling consumerisation of enterprise applications, plays a role in this, and what it means for IT moving forward.
On the supply side of IT, it means applications are best delivered as easily consumable services to employees and others (partners, customers, suppliers). One large European multinational is already delivering all its new applications as internet (not intranet) applications, meaning any application can be accessed from anywhere by simply entering a URL and authenticating properly. The choice of which applications to provide internally is based on whether outside parties are willing and able to provide these services, or whether the company can gain a distinct advantage by providing the service itself.
When speaking about consuming services, one should try to think more broadly than just IT services. The head of distribution may be looking for a parcel-tracking system, but the CEO or COO is more likely to think of a service as something a DHL or TNT delivers: warehousing and distribution, but also complaint tracking, returns and repairs, or even accounting, marketing and reselling - all including the associated IT parts of those services. It is the idea of Everything as a Service, but on steroids (XaaSoS). Note that even when an organization decides to provide one of these services internally, it can still source the underlying infrastructure and even applications "as a service" externally (strangely enough, this last scenario is what many an IT person seems to think of exclusively when discussing cloud computing).
On the demand side of IT the issue is an altogether different one. How do we guarantee continuity, efficiency and compliance in such a consumption-oriented IT world? If it is every man (or department) for themselves, how do we prevent suboptimisation? In fact, how do we even know what is going on in the first place? How do we know which services are being consumed? This is the new challenge, and it is very similar to what companies faced when they decided to stop manufacturing everything themselves, abandoning vertical integration where it made sense and taking a "supply chain" approach. Cloud computing is in many respects a similar movement, and here too a supply chain approach looks like the way to go.
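To make the visibility problem concrete, here is a minimal sketch of what a service-consumption registry might look like - one way an IT group taking the supply chain view could at least answer "which services is each department consuming, and from whom?". The class, its methods and the department/vendor names are all invented for illustration, not an existing tool.

```python
from collections import defaultdict


class ConsumptionRegistry:
    """Hypothetical registry of which external services each department consumes."""

    def __init__(self):
        # department -> set of (service, provider) pairs
        self._usage = defaultdict(set)

    def record(self, department, service, provider):
        """Log that a department consumes a service from a given provider."""
        self._usage[department].add((service, provider))

    def services_for(self, department):
        """Return the sorted list of (service, provider) pairs for one department."""
        return sorted(self._usage[department])


# Example: the kind of shadow consumption the post describes.
reg = ConsumptionRegistry()
reg.record("Sales", "CRM", "SomeSaaSVendor")
reg.record("Logistics", "parcel-tracking", "DHL")
print(reg.services_for("Sales"))  # [('CRM', 'SomeSaaSVendor')]
```

The point is not the code itself but the discipline: before IT can manage a supply chain of services, it needs an inventory of what is actually being consumed, even when it was never involved in the purchase.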
Such a supply chain approach means thoroughly understanding both demand and supply, matching the two, and making sure that the goods - or in this case services - reach the right audience at the right time (on demand). IT has invested a fair amount of time and effort in better ways and methodologies to understand demand. On the supply side, IT has until now assumed it was the supplier. In that role it used industry analysts to classify the required components, such as hardware and software. In this new world, IT needs to start thoroughly understanding the full services that are available on the market. An interesting effort worth mentioning here is the SMI (Service Measurement Index), an approach to classifying cloud services co-initiated by my employer, CA Technologies, and led by Carnegie Mellon University.
After gaining an understanding of both demand and supply, the remaining task is "connecting the dots". This sounds trivial, but it is an activity that analysts estimate will become a multi-billion-dollar industry within just a few years. It includes non-trivial tasks like identifying which users are allowed to perform which tasks in this now open environment, and optimizing processes by picking the resources with the lowest utilization and thus the lowest cost - because going forward, scarcity will determine price, especially in the new cloud world (which resembles Adam Smith's idea of a perfect open market far more closely than any internal IT department ever did or will).
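The "connecting the dots" step above can be sketched as a tiny broker that matches a demanded service to the cheapest adequate supplier, treating high utilization as a scarcity premium on price. Everything here - the provider names, the per-call pricing model, the scarcity formula - is an invented illustration of the idea, not how any real cloud marketplace prices its services.

```python
from dataclasses import dataclass


@dataclass
class ServiceOffer:
    provider: str
    service: str           # e.g. "parcel-tracking"
    price_per_call: float  # assumed pricing model for this sketch
    utilization: float     # 0.0 (idle) .. 1.0 (saturated)


def pick_offer(offers, service):
    """Pick the offer for `service` with the lowest effective cost.

    Scarcity raises the effective price: a saturated provider's list
    price is scaled up, echoing the open-market dynamic the post describes.
    """
    candidates = [o for o in offers if o.service == service]
    if not candidates:
        raise LookupError(f"no supplier offers {service!r}")
    return min(candidates, key=lambda o: o.price_per_call * (1 + o.utilization))


offers = [
    ServiceOffer("AcmeCloud", "parcel-tracking", 0.010, 0.90),
    ServiceOffer("BetaSaaS", "parcel-tracking", 0.012, 0.20),
]
best = pick_offer(offers, "parcel-tracking")
print(best.provider)  # BetaSaaS: cheaper once scarcity is priced in
```

Note that AcmeCloud's lower list price loses to BetaSaaS once utilization is factored in (0.010 x 1.9 = 0.019 versus 0.012 x 1.2 = 0.0144 per call) - exactly the scarcity-determines-price behaviour described above.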
Now of course, all of the above won't happen overnight. Many a reader (and with a little luck the author) will have retired by the time today's vertically integrated systems - many of which are several decades old and based on solid, reliable mainframes - have become services brokered in an open cloud market. A couple of high-profile outages may even delay this by a generation or two. But long term I see no other way. Other markets (electricity, electronics, publishing and even healthcare) have taken or are taking the same path. It is the era of consumption.
PS: Short term, however, the thing we (IT) probably need most is a new diagramming technique. Why? From the above it will be clear that, in such a consumerised world, architecture diagrams are a thing of the past. And an IT person without a diagram is like a fish without water. We need something that allows us to evolve our IT fins into feet and our IT gills into lungs, so we can transition from water to land without becoming extinct in the process. One essential aspect will be that, unlike pictures of clouds and very much like real clouds, the diagrams will need to change dynamically, much like the pictures in a Harry Potter movie (it's magic). Who has a suggestion for such a technique?
[Editorial note: This blog was originally published at ITSMportal.com on May 31, 2010]