By Tony Bishop
May 17, 2011 06:45 AM EDT
This is part 3 of the continuing article "From Legacy to the Cloud: Next Generation Datacenters"
Restating the obvious - Business and IT have planned, designed and operated business systems in a silo-oriented manner, with disconnected decisions and actions about how the business needs to execute and how systems are created, implemented and actually operate.
In the first two articles in this series I detailed the problems with legacy datacenter models and began laying out a proven method to attack them: a "day in the life" approach that surfaces the truth of what is happening from the business down to the datacenter, and which decisions and actions have caused the day in the life to deviate from expectations.
This article focuses on the Alignment step of the four-step strategy for attacking legacy IT:
Insight - a "day in the life" understanding of both business and IT in terms of execution, historical decisions and actions, and an objective understanding of the actual situation as it exists today.
Alignment - creating a common language, taxonomy and system creation model to produce repeatable results.
Control - specific actions, tooling, process and approach to ensure the appropriate change occurs and re-occurs successfully.
Sustainment - mechanisms and processes instituted in a repeatable discipline to ensure consistent results and avoid falling back into the legacy traps.
Step 2 - Alignment
Recap from Step 1
In Step 1, we gathered day in the life information on an end-to-end basis. We hit on the key linkages of a business value chain tied to an IT supply chain. By profiling, mapping, modeling and correlating in an objective, data-driven manner, we have now assembled the input data for our alignment processing step. In particular, we hit on four key actions of data collection that will not only serve the transformation process but must become part of a sustainment process to ensure continual alignment between business and IT going forward.
The four key data principles are:
- "Know how you make money"
- "Know what fulfills your money making process"
- "Know your IT supply chain components & usage"
- "Know who & how you operate your IT supply chain"
Organizations that have been successful in migrating legacy datacenter infrastructures to next generation, cloud-like models have adopted a critical strategy: "Systems Thinking".
Wikipedia definition - Systems Thinking has been defined as an approach to problem solving that views "problems" as parts of an overall system. In organizations, systems consist of people, structures and processes that work together to make an organization healthy or unhealthy. Systems thinking is not one thing but a set of habits or practices within a framework, based on the belief that the component parts of a system can best be understood in the context of their relationships with each other and with other systems, rather than in isolation.
If we go back to the problem I covered in the first article - most organizations (the ones with extreme legacy infrastructure problems) have not planned, designed or created their specific applications and infrastructure systems with a strategic "Systems Thinking" approach. Systems Thinking is "the" key, the missing component in a successful formula that ensures alignment between Business and IT on the journey to the next generation datacenter in the cloud. That formula is defined below.
Alignment Formula = Insight Data Points + Systems Thinking + 80/20 Rule + Capability-Oriented Taxonomy.
Insight Data Points - create the inputs and boundaries for decision-making. These data inputs are the qualification and quantification of: Business Drivers x Business Behaviors x Key Performance Indicators by Business Function x Business Value Chain x Workload Qualities x Business Events x IT Supply Chain Qualities & Behaviors.
Systems Thinking - the framework for the design decision-making process: a holistic, end-to-end, top-to-bottom design thought process, from business user to IT rack on specific floor space with a specific power distribution, cabling and cooling strategy, translated into the stated quality of execution required - i.e., performance, cost, agility, availability, fungibility, etc.
80/20 Rule - a critical method to avoid scope creep and analysis paralysis. With proper Insight data gathering that captures business behaviors, profiled and trended, correlated with IT supply chain behaviors (again, think day in the life), a portfolio profile emerges in which 20% of the business functions consume close to 80% of the IT supply chain on a normalized basis. Typically these are client functions, revenue functions, regulatory functions and business-risk-related functions. Attack anything new in these functions with a next generation strategy. Quarantine any legacy that does not fit this profile, and modernize any existing systems that do.
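The 80/20 screening above can be sketched as a simple ranking exercise. This is a minimal illustration, not the author's tooling: the business function names and consumption figures are invented for the example, and any real analysis would use the normalized day-in-the-life data gathered in Step 1.

```python
# Hedged sketch: find the ~20% of business functions that consume
# ~80% of the IT supply chain. All names and numbers are illustrative
# assumptions, not data from the article.

def portfolio_profile(consumption):
    """Rank business functions by normalized IT supply-chain consumption
    and return (top_functions, share_of_total) for the top 20% of functions."""
    ranked = sorted(consumption.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(consumption.values())
    top_n = max(1, round(len(ranked) * 0.20))
    top = ranked[:top_n]
    share = sum(value for _, value in top) / total
    return [name for name, _ in top], share

# Illustrative portfolio: 10 functions, consumption skewed toward a few.
demo = {
    "client-trading": 50, "revenue-billing": 30, "regulatory-reporting": 6,
    "risk-analytics": 4, "hr-portal": 3, "intranet": 2, "crm": 2,
    "reporting": 1, "archive": 1, "misc-batch": 1,
}

top_functions, share = portfolio_profile(demo)
print(top_functions)               # candidates for next generation strategy
print(f"{share:.0%} of IT supply chain")  # → 80% of IT supply chain
```

Functions outside the returned list are the quarantine candidates; anything inside it is where new work gets the next generation treatment.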
Capability-Oriented Taxonomy - build it, buy it, borrow it, but establish a common taxonomy in which Business and IT must think and speak. What are the business capabilities that generate revenue, and what IT supply chain capabilities do they require to capture data, analyze or process the data, share information or content, update information or content, and store the various information or content in a day in the life? This is critical to keep everyone aligned and create an environment of collaborative innovation. (Hint - borrow from Six Sigma, LEAN, SOA, EA, ISO, etc. to build a taxonomy that takes you from user to datacenter facilities. If you don't have a repository nearing 1000 discrete categories, you probably have an incomplete taxonomy.)
Note - this is an iterative process in which firms must crawl, walk, then run in terms of defining and building out the taxonomy. The key is to establish the classifications, the definitions, and the habit of speaking and thinking in this shared manner between Business and IT.
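A capability-oriented taxonomy can be modeled as a simple tree linking business capabilities to the IT supply chain capabilities beneath them. The sketch below is an assumed structure for illustration only; the layer and category names are invented, and a real repository would hold on the order of 1000 discrete categories.

```python
# Hedged sketch of a capability-oriented taxonomy as a tree. Layer and
# category names are illustrative assumptions, not taken from the article.
from dataclasses import dataclass, field

@dataclass
class Capability:
    name: str        # the shared Business/IT vocabulary term
    layer: str       # e.g. "business", "application", "infrastructure"
    children: list = field(default_factory=list)

    def add(self, child):
        self.children.append(child)
        return child

    def count(self):
        """Total categories in this subtree (a taxonomy-completeness check)."""
        return 1 + sum(c.count() for c in self.children)

root = Capability("revenue-generation", "business")
process = root.add(Capability("process-data", "application"))
process.add(Capability("low-latency-compute", "infrastructure"))
root.add(Capability("share-content", "application"))
print(root.count())  # → 4
```

A `count()` sweep over the repository is one cheap way to apply the article's "nearing 1000 discrete categories" completeness heuristic.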
Tooling & Approach
This formula needs to be implemented in a structured and well-managed manner. A transformation office should be established with dedicated resources allocated from both Business and IT.
To get started, organizations need to create their library. This library needs to incorporate the Day in the Life modeling and profiling, along with an inventory of business systems with their deployment attributes and interdependencies profiled.
Organizations need to establish a Portfolio Decision Matrix that maintains all decisions and actions agreed to by Business and IT when applying the Alignment Formula.
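In its simplest form, the Portfolio Decision Matrix is a shared table of decisions with owners and rationale. The record fields and example rows below are assumptions for illustration; the article only requires that Business/IT decisions and actions be maintained in one place.

```python
# Hedged sketch of a Portfolio Decision Matrix persisted as CSV so Business
# and IT review the same artifact. Field names and rows are illustrative
# assumptions, not prescribed by the article.
import csv
import io
from dataclasses import dataclass, asdict

@dataclass
class Decision:
    system: str
    disposition: str   # "next-gen" | "modernize" | "quarantine"
    rationale: str
    owner: str         # the accountable Business/IT pair

matrix = [
    Decision("client-trading", "next-gen", "top-20% revenue function", "Biz+IT"),
    Decision("archive", "quarantine", "outside the 80/20 profile", "IT"),
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["system", "disposition", "rationale", "owner"])
writer.writeheader()
writer.writerows(asdict(d) for d in matrix)
print(buf.getvalue())
```

The CSV form is incidental; the point is a single versioned artifact that records what was decided, by whom, and why.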
A Technology Product Catalogue needs to be created or refined in a capability-oriented manner, with gaps identified. This is the master artifact against which the IT Supply Chain will be transformed, measured and benchmarked.
A roadmap needs to be created of current-state to target-state systems and supply chain evolution by capability. When creating it, think of peeling an onion: each layer has a purpose. In this case, each layer of the roadmap needs to communicate to a specific audience.
Day in the Life Modeling needs to be translated into various Use Cases (Business functionality to technical enablement). These use cases need to be mapped to patterns (business and data patterns to application and workload patterns). The patterns then need to map to capability deployment strategies (how logical patterns will manifest themselves into physical realities). The key here is to use this modeling & translation effort as your CHANGE VEHICLE that ensures actions realize the alignment decisions made.
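The translation chain above (use case, to pattern, to capability deployment strategy) can be sketched as two lookup tables and a trace function. Every name below is an invented example, not one from the article; a real change vehicle would carry many such mappings derived from the day in the life models.

```python
# Hedged sketch of the use case -> pattern -> deployment-strategy chain.
# All use case, pattern, and deployment names are illustrative assumptions.

use_case_to_pattern = {
    "submit-trade-order": "low-latency-request-reply",
    "generate-regulatory-report": "batch-aggregation",
}

pattern_to_deployment = {
    "low-latency-request-reply": "dedicated compute pods, local SSD, 10GbE",
    "batch-aggregation": "shared grid, scheduled off-peak, bulk storage",
}

def deployment_for(use_case):
    """Trace one use case through its workload pattern to the physical
    deployment strategy that realizes it."""
    pattern = use_case_to_pattern[use_case]
    return pattern, pattern_to_deployment[pattern]

pattern, deployment = deployment_for("submit-trade-order")
print(pattern, "->", deployment)
```

Keeping the two mappings explicit is what makes the modeling effort auditable: every deployment choice traces back to an alignment decision made against a named use case.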
Next, organizations must establish an engineering deployment model discipline. Here, the translated output from the day in the life modeling above needs to be incorporated into an IT supply chain engineering deployment that is linked to automation assembly and discrete policies for enactment.
The final tooling and process component is a communication strategy that incorporates methods, frequencies and playbooks tuned to your specific organization's learning culture. Here, you must figure out how to use concepts like town halls, executive briefings, status/progress reports, a decision arbitration board, and detailed playbooks (think a yellow "for dummies" guide) that outline the steps being taken, decisions made or to be made, achievements accomplished, and why things are or are not being done.
If you are still reading at this point, you are ready to transform! The next article will outline the processes and tooling necessary to "control" and effect the change to your next generation datacenter.