Next Generation Datacenters: From Legacy to the Cloud

Part 3 of 5 – The Strategy – Step 2 Alignments

This is Part 3 of the continuing series "From Legacy to the Cloud: Next Generation Datacenters."

Restating the obvious - business and IT have planned, designed, and operated business systems in a silo-oriented manner, with disconnected decisions and actions about how the business needs to execute and how systems are created, implemented, and actually operated.

In the first two articles in this series, I detailed the problems with legacy datacenter models and began laying out a proven method to attack them: a "day in the life" approach that surfaces the truth of what is happening from the business down to the datacenter, and which decisions and actions have caused the day in the life not to execute as expected.

This article focuses on the Alignment step of the four-step strategy for attacking legacy IT:

  • Insight - a "day in the life" understanding of both business and IT in terms of execution, historical decisions and actions, and an objective view of the actual situation as it exists today.
  • Alignment - creating a common language, taxonomy, and system creation model to produce repeatable results.
  • Control - specific actions, tooling, processes, and approaches to ensure the appropriate change occurs and re-occurs successfully.
  • Sustainment - mechanisms and processes instituted as a repeatable discipline to ensure consistent results and avoid falling back into legacy traps.

 

Step 2 - Alignment

Recap from Step 1

In Step 1, we gathered "day in the life" information on an end-to-end basis. We hit on the key linkages of a business value chain tied to an IT supply chain. By profiling, mapping, modeling, and correlating in an objective, data-driven manner, we have now assembled the input data to our alignment-processing step. In particular, we hit on four key actions of data collection that will not only serve in the transformation process but must become part of a sustainment process to ensure continual alignment between business and IT going forward.

The four key data principles are:

  • "Know how you make money"
  • "Know what fulfills your money making process"
  • "Know your IT supply chain components & usage"
  • "Know who & how you operate your IT supply chain"

Creating Alignment

Organizations that have been successful in migrating legacy datacenter infrastructures to next-generation, cloud-like models have adopted a critical strategy: "Systems Thinking."

Wikipedia definition - systems thinking has been defined as an approach to problem solving that views "problems" as parts of an overall system. In organizations, systems consist of people, structures, and processes that work together to make an organization healthy or unhealthy. Systems thinking is not one thing but a set of habits or practices within a framework, based on the belief that the component parts of a system can best be understood in the context of their relationships with each other and with other systems, rather than in isolation.

If we go back to the problem I covered in the first article, most organizations (the ones with extreme legacy infrastructure problems) have not planned, designed, or created their applications and infrastructure systems with a strategic "Systems Thinking" approach. Systems Thinking is the key missing component in a successful formula that ensures alignment between business and IT on the journey to the next-generation datacenter in the cloud. That formula is defined below.

Alignment Formula = Insight Data Points + Systems Thinking + 80/20 Rule + Capability-Oriented Taxonomy

Formula Explained

Insight Data Points - create the inputs and boundaries for decision-making. These inputs qualify and quantify: Business Drivers x Business Behaviors x Key Performance Indicators by Business Function x Business Value Chain x Workload Qualities x Business Events x IT Supply Chain Qualities & Behaviors.
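One way to make these dimensions concrete is to capture each insight data point as a single structured record that business and IT fill in together. The sketch below is a hypothetical illustration only - the field names mirror the dimensions listed above, but the article does not prescribe any particular data structure, and all the example values are invented.

```python
# Hedged sketch: recording one insight data point as a structured record.
# Field names follow the dimensions in the formula above; the example
# values are hypothetical, not data from the article.
from dataclasses import dataclass


@dataclass
class InsightDataPoint:
    business_driver: str        # why the business does this
    business_behavior: str      # how demand actually behaves
    kpi_by_function: dict       # e.g. {"order capture": "orders/sec"}
    value_chain_stage: str      # where it sits in the business value chain
    workload_quality: str       # e.g. "latency-sensitive"
    business_event: str         # the triggering event
    supply_chain_behavior: str  # observed IT supply chain response


point = InsightDataPoint(
    business_driver="grow client revenue",
    business_behavior="demand spikes at market open",
    kpi_by_function={"order capture": "orders/sec"},
    value_chain_stage="trade execution",
    workload_quality="latency-sensitive",
    business_event="market open",
    supply_chain_behavior="CPU saturates 9:30-10:00",
)
```

Keeping every data point in one shared shape like this is what lets the later steps (decision matrix, roadmap) be driven by data rather than opinion.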

Systems Thinking - is the framework for the design decision-making process. Holistic end to end, top to bottom design thought process from business user to IT Rack on specific floor space with specific power distribution, cabling and cooling strategy translating to stated quality of execution required - i.e. Performance, cost, agility, availability, fungability, etc...

80/20 Rule - a critical method for avoiding scope creep and analysis paralysis. With proper insight data gathering that captures business behaviors, profiled and trended, correlated with IT supply chain behaviors (again, think day in the life), a portfolio profile emerges in which 20% of the business functions consume close to 80% of the IT supply chain on a normalized basis. Typically these are client functions, revenue functions, regulatory functions, and business-risk-related functions. Attack anything new in these functions with the next-generation strategy, quarantine any legacy that does not fit this profile, and modernize any existing systems that do.
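The 80/20 segmentation above can be sketched as a simple Pareto cut over normalized consumption data. This is an illustrative sketch only - the function names, consumption figures, and the 0.8 threshold are hypothetical assumptions, not figures from the article.

```python
# Hedged sketch: find the small set of business functions that together
# consume ~80% of the IT supply chain. All names and numbers below are
# hypothetical illustrations.

def pareto_segment(consumption, threshold=0.8):
    """Return the functions that together account for `threshold`
    of total normalized IT supply chain consumption."""
    total = sum(consumption.values())
    selected, running = [], 0.0
    # Walk functions from heaviest consumer down, stopping at the threshold.
    for func, usage in sorted(consumption.items(),
                              key=lambda kv: kv[1], reverse=True):
        if running / total >= threshold:
            break
        selected.append(func)
        running += usage
    return selected


# Hypothetical normalized consumption per business function
profile = {
    "client trading": 40, "revenue reporting": 25, "regulatory filing": 15,
    "hr portal": 5, "intranet": 5, "facilities": 5, "archival": 5,
}
core = pareto_segment(profile)  # the functions to attack/modernize first
```

In this invented profile, three of seven functions carry 80% of the consumption; those are the candidates for the next-generation strategy, and the rest are quarantine candidates.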

Capability-Oriented Taxonomy - build it, buy it, or borrow it, but establish a common taxonomy that business and IT must think and speak in. What are the business capabilities that generate revenue, and the required IT supply chain capabilities to capture data, analyze or process it, share information or content, update it, and store it in a day in the life? This is critical to keeping everyone aligned and creating an environment of collaborative innovation. (Hint: borrow from Six Sigma, LEAN, SOA, EA, ISO, etc. - anything that takes you from user to datacenter facilities. If you don't have a repository nearing 1,000 discrete categories, you probably have an incomplete taxonomy.)
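A minimal sketch of what one taxonomy entry might look like, assuming a tree of capabilities spanning the layers from business function down to facilities. The layer names, verbs, and example entries are hypothetical; the article only prescribes a shared classification with on the order of 1,000 discrete categories.

```python
# Hedged sketch of a capability-oriented taxonomy node. Layer names and
# example capabilities are illustrative assumptions, not the article's.
from dataclasses import dataclass, field


@dataclass
class Capability:
    name: str                   # e.g. "capture order data"
    layer: str                  # business | application | data | infrastructure | facilities
    verbs: list = field(default_factory=list)   # capture, analyze, share, update, store
    children: list = field(default_factory=list)

    def count(self):
        """Discrete categories in this subtree - a rough gauge of
        taxonomy completeness against the ~1,000-category hint above."""
        return 1 + sum(child.count() for child in self.children)


store = Capability("store order content", "data", ["store"])
capture = Capability("capture order data", "business", ["capture"], [store])
```

The point of the structure is not the code but the shared vocabulary: every capability has a name, a layer, and the day-in-the-life verbs it serves, so business and IT are describing the same thing.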

Note - this is an iterative process in which firms must crawl, walk, then run as they define and build out the taxonomy. The key is getting business and IT to classify, define, and describe behavior - to speak and think - in this common manner.

Tooling & Approach

This formula needs to be implemented in a structured, well-managed manner. A transformation office should be established with dedicated business and IT resources.

To get started, organizations need to create their library. This library needs to incorporate the day in the life modeling and profiling, an inventory of business systems' deployment attributes, and profiled interdependencies.

Organizations need to establish a Portfolio Decision Matrix that maintains all decisions and actions agreed to by business and IT when applying the Alignment Formula.
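The decision matrix is essentially a running, structured log that both sides sign up to. A minimal sketch, assuming attack/modernize/quarantine classifications from the 80/20 rule above; all field names and the example entry are hypothetical.

```python
# Hedged sketch of a portfolio decision matrix: a shared log of decisions
# agreed by business and IT when applying the Alignment Formula.
# Classifications follow the 80/20 guidance: 'attack' (new work in the
# core 20%), 'modernize' (existing systems in the core 20%), 'quarantine'
# (legacy outside the profile). All entries below are illustrative.

decisions = []


def record_decision(system, classification, action, owners):
    """Append one agreed decision so it can be tracked and revisited."""
    assert classification in {"attack", "modernize", "quarantine"}
    entry = {
        "system": system,
        "classification": classification,
        "action": action,
        "owners": owners,  # both business and IT names belong here
    }
    decisions.append(entry)
    return entry


record_decision("order capture", "modernize",
                "migrate to virtualized pool in Q3",
                ["business ops", "IT engineering"])
```

What matters is that every decision names its owners on both sides, so the matrix doubles as the accountability record during sustainment.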

A Technology Product Catalogue needs to be created or refined so that it is capability-oriented and gaps are identified. This is the master artifact against which the IT supply chain will be transformed, measured, and benchmarked.

A roadmap of current-state to target-state systems and supply chain evolution needs to be created by capability. When creating it, think of peeling an onion: each layer has a purpose. In this case, each layer of the roadmap needs to communicate to a particular audience.

Day in the life modeling needs to be translated into various use cases (business functionality to technical enablement). These use cases need to be mapped to patterns (business and data patterns to application and workload patterns). The patterns then need to map to capability deployment strategies (how logical patterns will manifest themselves as physical realities). The key here is to use this modeling and translation effort as your CHANGE VEHICLE, ensuring actions realize the alignment decisions made.
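The use case → pattern → deployment strategy chain above can be sketched as two lookup tables composed together. Every name below is a hypothetical illustration; the article specifies only that each level must map cleanly onto the next.

```python
# Hedged sketch of the translation chain:
#   use case -> workload pattern -> capability deployment strategy.
# All entries are invented examples of the mapping, not the article's.

use_case_to_pattern = {
    "client places order": "request/response, latency-sensitive",
    "nightly risk rollup": "batch, throughput-oriented",
}

pattern_to_deployment = {
    "request/response, latency-sensitive": "dedicated low-latency pool",
    "batch, throughput-oriented": "shared grid, off-peak scheduling",
}


def deployment_for(use_case):
    """Follow the chain from business use case to physical strategy."""
    return pattern_to_deployment[use_case_to_pattern[use_case]]


target = deployment_for("nightly risk rollup")
```

Keeping the two mappings explicit and separate is what makes the translation a change vehicle: when a decision in the matrix changes a pattern's deployment strategy, every use case mapped to that pattern inherits the change.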

Next, organizations must establish an engineering deployment model discipline. Here, the translated output defined in the day in the life modeling above needs to be incorporated into IT supply chain engineering deployment that is linked to automation assembly and to discrete policies for enactment.

The final tooling and process component is a communication strategy that incorporates methods, frequencies, and playbooks tuned to your organization's specific learning culture. Here you must figure out how to use concepts like town halls, executive briefings, status/progress reports, a decision arbitration board, and detailed playbooks (think a yellow-book "for dummies" guide) that outline the steps being taken, decisions made or to be made, achievements accomplished, and why things are or are not being done.

If you are still reading at this point, you are ready to transform! The next step, covered in the next article, outlines the processes and tooling necessary to "control" and effect the change to your next-generation datacenter.

More Stories By Tony Bishop

Blueprint4IT is authored by a longtime IT and Datacenter Technologist. Author of Next Generation Datacenters in Financial Services – Driving Extreme Efficiency and Effective Cost Savings. A former technology executive for both Morgan Stanley and Wachovia Securities.
