By Jeremy Geelan
June 17, 2009 08:00 AM EDT
The IT industry is faced with a complexity and affordability crisis: explosive information growth; heavily interconnected and interdependent systems; on average, 70% of IT spending going to maintenance; low utilization of resources driving up fixed costs; and energy consumption becoming an ever-bigger drain on budgets.
At the same time, business needs for flexibility and responsiveness continue unabated. This creates an urgency for enterprises to rethink the way their data centers are set up and managed, and how they receive and deliver services.
Bring this together with advances in technology - from service orientation, automation and service management to virtualization - and you have what Dr Kristof Kloeckner, VP of Cloud Computing Platforms at IBM, vividly calls "a perfect storm." Kloeckner was a keynote speaker at SYS-CON's 3-day 2nd International Cloud Computing Conference & Expo (March 30-April 1, 2009), the industry's leading worldwide Cloud Computing event, now held three times a year, in New York, Silicon Valley, and Europe.
In this interview with Conference Chair Jeremy Geelan, conducted in March, Dr Kloeckner discusses a wide range of Cloud Computing issues and gives very clear insight into IBM's vision with regard to Cloud Services and the substantially improved delivery economics that Cloud Computing is making possible.
Jeremy Geelan: What are the main business drivers for Cloud Computing - for this overall technology trend?
Dr Kristof Kloeckner: In the end, it’s all about money – how much do you spend for just maintaining the status quo, and how much on supporting truly differentiating business initiatives. This drives an imperative for dynamic infrastructures, increasing resource utilization and reducing labor costs, and for more flexible economics in the consumption and delivery of IT based services.
Geelan: And how about from a specifically IBM perspective – what do you think is missing right now from the Cloud Computing Ecosystem, that you can uniquely provide?
Kloeckner: While IBM invented many of the technologies that form the basis of cloud computing (virtualization, for instance, was first implemented in our mainframes), our greatest asset is our deep understanding of our clients, and our experience running the world's largest data centers.
We are using this experience to build a comprehensive portfolio of cloud-related project-based services, products that let clients build their own clouds, and cloud-delivered services that we provide ourselves. Our portfolio spans infrastructure services, platform services, and application, process and information services.
We also have a strong and long-standing commitment to open environments, and we will work with the industry to ‘keep the clouds open’. This is a major prerequisite for the emergence of a cloud eco-system.
Geelan: How important to IBM strategically is its Blue Cloud Group?
Kloeckner: Well, we are actually calling the new organization that was formed under Erich Clementi “Enterprise Initiatives”, indicating that it brings together all of IBM to build and deliver offerings that enable cloud computing.
Cloud Computing is important to us because the promise of substantially improved delivery economics will have a massive transformative impact on IT based services and business processes. There is a tremendous amount of energy around cloud computing across IBM, and in our clients and partners.
Geelan: What’s the best way, do you think, to define “cloud services” – from an Enterprise IT perspective?
Kloeckner: From a provider perspective, cloud services are characterized through self service, economies of scale and hybrid (public, private and mixed) modes of delivery. Self-service drives client satisfaction and standardization of services. Economies of scale are enabled through large virtualized and automated shared environments, and hybrid delivery models combine external and internal services.
From a user perspective, the most important aspects are ease of use, new economics derived from cost structures that are achieved by greater sharing of resources, and flexible sourcing.
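The "new economics" Kloeckner describes can be made concrete with a back-of-the-envelope calculation. The sketch below uses entirely hypothetical numbers (they do not come from the interview) to show why spreading a fixed cost over a highly utilized shared pool beats a lightly utilized dedicated machine:

```python
# Illustrative only: hypothetical numbers showing how sharing a fixed-cost
# resource pool lowers per-unit cost as utilization rises.

def cost_per_unit(fixed_cost, capacity_units, utilization):
    """Effective cost per consumed unit of capacity."""
    consumed = capacity_units * utilization
    return fixed_cost / consumed

# Dedicated server: one tenant, assumed ~15% utilization.
dedicated = cost_per_unit(fixed_cost=10_000, capacity_units=100, utilization=0.15)

# Shared cloud pool: many tenants are assumed to drive utilization to ~60%.
shared = cost_per_unit(fixed_cost=10_000, capacity_units=100, utilization=0.60)

# The same fixed cost, spread over four times as many consumed units,
# yields a quarter of the per-unit cost.
print(f"dedicated: ${dedicated:.2f}/unit, shared: ${shared:.2f}/unit")
```

The 15% and 60% figures are assumptions for illustration; the structural point, that the per-unit cost falls in inverse proportion to utilization, is what the "greater sharing of resources" argument rests on.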
Geelan: How big an issue is security for enterprises who wish to migrate toward this kind of an infrastructure wholly or in part?
Kloeckner: Enterprises have a choice among a spectrum of delivery modes, from private to virtual private to public clouds, and they are making these selections based on workload characteristics. We find many clients opting to keep their most sensitive applications and data private, behind their firewalls (or virtually private with limited access). In these setups, all the existing best practices apply for data and application access and trust and identity management.
As for public clouds, it's important to remember that, as on the Web in general, clients need to fully understand the security policies and practices of their providers. I believe that federated identity and trust management will be extremely important here.
Geelan: And what about management, how’s that being taken care of? Can the deployment and management of computing clouds really be automated, or is that in the far-off future still?
Kloeckner: We’re getting there. In February, IBM launched Tivoli Service Automation Manager, which facilitates the dynamic instantiation of cloud-delivered services and their management across the entire life cycle, drawing on IBM Service Management capabilities and platform management services.
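The life-cycle management Kloeckner mentions can be pictured as a state machine over a service instance. The sketch below is not IBM's product; it is a minimal, hypothetical model of the idea, with made-up state names, showing how automation enforces valid transitions from request through retirement:

```python
# Hypothetical sketch of service life-cycle automation as a state machine.
# States and transitions are illustrative, not from any IBM product.

ALLOWED = {
    "requested":      {"provisioning"},
    "provisioning":   {"active", "failed"},
    "active":         {"deprovisioning"},
    "failed":         {"provisioning"},      # automated retry
    "deprovisioning": {"retired"},
}

class ServiceInstance:
    def __init__(self, name):
        self.name = name
        self.state = "requested"

    def transition(self, new_state):
        """Move to new_state, rejecting anything the life cycle forbids."""
        if new_state not in ALLOWED.get(self.state, set()):
            raise ValueError(f"{self.state} -> {new_state} not allowed")
        self.state = new_state
        return self.state

# Walk one instance through its full life cycle.
svc = ServiceInstance("web-tier")
for step in ("provisioning", "active", "deprovisioning", "retired"):
    svc.transition(step)
print(svc.name, svc.state)  # web-tier retired
```

The point of encoding the life cycle explicitly is that automation can then drive every instance through the same vetted path, which is what makes dynamic instantiation at scale manageable.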
Geelan: How big a part are standards going to play in the success of the Cloud?
Kloeckner: Standards are essential for customer choice and eco-system growth. We believe the area most important to address is interoperability between clouds and the integration between clouds and other enterprise IT services. Work done on service oriented architecture in recent years will greatly help us address the issue of keeping the clouds open.
Geelan: Tell us about the partnerships you just announced with Juniper and Amazon. What do they indicate about the future trajectory of IBM’s endeavors in this area?
Kloeckner: IBM has a broad ecosystem of partners and we have a long history of supporting customer choice. We chose to work with Juniper in this instance, demonstrating connectivity between clouds, based on the combination of features, ease of integration, and the ability to leverage their MPLS technology for secure remote access. Amazon represents yet another venue for IBM to sell its software. We will continue to work with partners to advance the adoption of technologies like cloud computing, and especially to ensure open clouds.
Geelan: Moving beneath the hood for a moment, how does IBM handle the virtualization layer of its Cloud infrastructure?
Kloeckner: What we do depends upon the choice of underlying platform(s). Increasingly, virtualization technologies will be provided as integrated capabilities of the IT resources themselves. This has long been our practice on System z and Power Systems, and overall the industry is moving in this direction. The benefits include greater simplicity, efficiency, resiliency, and security.
Our service management software builds upon these virtualization technologies to provide much greater IT benefits, especially in terms of productivity and agility. Key virtualization-based capabilities of value to Clouds include resource pool ("ensemble") management and virtual resource object management. What sets us apart from others is our strength in management across the diversity of physical and virtual resources (at both the hardware and application levels) - diversity which will continue to increase driven by accelerating innovation.
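Resource pool ("ensemble") management ultimately comes down to placing virtual resources onto physical ones. As a hedged illustration (this is a toy first-fit placer with made-up host and VM sizes, not IBM's placement engine, which would weigh many more constraints), the core idea can be sketched as:

```python
# Toy sketch of resource-pool ("ensemble") placement: assign each virtual
# machine request to the first physical host with enough free capacity.
# Host and VM sizes are hypothetical, for illustration only.

def place(vms, hosts):
    """First-fit placement; returns {vm_name: host_name} or raises."""
    free = {host: cap for host, cap in hosts.items()}  # remaining capacity
    placement = {}
    for vm, need in vms.items():
        for host, avail in free.items():
            if need <= avail:
                free[host] -= need      # reserve capacity on this host
                placement[vm] = host
                break
        else:
            raise RuntimeError(f"no capacity for {vm}")
    return placement

hosts = {"host-a": 16, "host-b": 8}       # CPU cores per host
vms = {"db": 10, "app": 6, "cache": 4}    # cores requested per VM
print(place(vms, hosts))
```

Even this toy version shows why management across diverse resources is the hard part: a real placer must also respect memory, I/O, licensing, and availability constraints across heterogeneous physical and virtual layers.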
Geelan: When you unveiled your new cloud strategy at a press conference during Pulse 2009, you underlined that IBM had a great deal to offer smaller businesses, in terms of offering them ready access to best practices and saving them from re-inventing the wheel. What offerings in particular did you have in mind?
Kloeckner: IBM has a number of cloud offerings that suit small and medium sized businesses well because they offer superior function that would not be affordable for smaller businesses to build and run themselves. As an example, LotusLive is a cloud-delivered portfolio of social networking and collaboration services designed for businesses. Launched in January, the service already has 30,000 businesses signed up. As another example, IBM’s Information Protection Services offer enterprise-grade data backup and recovery services to SMB clients like Neighborhood Centers, Allscripts and The United States Golf Association. For smaller cloud service providers, IBM’s Resilient Cloud Validation program allows businesses who collaborate with IBM on a rigorous, consistent and proven program of benchmarking and design validation to use the IBM “Resilient Cloud” logo when marketing their services.
Geelan: Previously you were VP of Development for Tivoli; what parts of that experience help you most in formulating IBM’s cloud strategy?
Kloeckner: Tivoli lives in the world of service management and service delivery, so the experience I gained in Tivoli gives me an appreciation of the operational considerations of establishing and running a cloud. Tivoli also works very closely with our Systems and Technology Group and with IBM Research to drive the management of virtualized environments. Clearly, service automation and virtualization, which enable a dynamic infrastructure, are essential to delivering a large part of the efficiencies and savings clients want to gain from cloud computing. Essentially, our ‘operational support system’, to use service provider terminology, is based on the Tivoli service management portfolio, in particular Tivoli Service Automation Manager.
Geelan: SYS-CON had the pleasure some years ago of interviewing Willy Chiu – who I believe was a colleague of yours – and his vision of HPC seemed already to anticipate much of what we’re now calling cloud computing. How long has IBM in fact been cooking its Cloud in the kitchen?
Kloeckner: While November 15, 2007 marked the official unveiling of IBM’s Blue Cloud initiative, you can find many of the business considerations and technology components that drive and enable cloud computing already as part of our ‘On Demand’ initiative – service orientation, automation, virtualization, and especially the notion that business and technology need to come together to develop transformational force.
As Sam Palmisano defined it in 2005, “On Demand Business is our way of describing a fundamental shift in computing architecture and how it is applied to business — a shift toward integrated solutions and quantifiable business value, not just technology features and functions.” Sounds pretty similar to what folks are saying about cloud today. We are now in the next phase of technology evolution, with a high sense of business urgency.
Geelan: What of the future – what are some of the most interesting infrastructure technologies being developed at IBM right now?
Kloeckner: Within IBM Research and Development, we are working on a number of exciting technologies, for instance management of ensembles of virtualized resources, service life cycle management, multi-tenancy support, image management, tools for development and deployment of services, the whole notion of ‘connectivity as a service’, to name just a few. We are also learning a lot from direct engagements with advanced clients, and working on application areas that can benefit from the cloud, like analytics or massive event processing.
As a general remark, we are seeing more ‘smart’ applications emerging in an interconnected world of ‘intelligent’, instrumented systems, in industries like energy and utilities, health care, logistics and many others. We believe that many of these applications will need clouds for efficient delivery.
Geelan: 2009 is a year of obvious challenges, from both a CapEx and an OpEx perspective, for anyone involved with Enterprise IT. Finally, what’s your top tip, as a seasoned software executive, to those other CTOs out there right now – especially CTOs of embattled start-ups who may be looking for some magic bullet to ensure they’re alive (and well) as a company in 2010?
Kloeckner: Take a careful look at the challenges and opportunities that cloud computing offers in your specific situation, develop a strategy and choose a strong partner for implementation. We are confident that IBM has much to offer in this space…