Next Generation Datacenters: From Legacy to the Cloud

Part 2 of 5 – The Strategy – Step 1 Insight

This is part 2 of the continuing article series “From Legacy to the Cloud: Next Generation Datacenters”

Restating the obvious: Business and IT have planned, designed and operated business systems in a silo-oriented manner, with disconnected decisions and actions about how the business needs to execute and how systems are created.

In the first article in this series I stated that there is no silver bullet or single strategy that solves everything. There are, however, proven methods and successful strategies that multiple firms have employed to attack the legacy problems in their existing datacenter infrastructures and iterate their way toward next generation datacenters and IT delivery à la cloud computing models.

This article series describes a four-step strategy for attacking legacy IT. These steps can be summarized as:

  • Insight – ‘day in the life’ understanding of both business and IT in terms of execution, historical decisions and actions and objective understanding of the actual situation as it exists today.
  • Alignment – creating a common language, taxonomy and system creation model to produce repeatable results.
  • Control – specific actions, tooling, process and approach to ensure the appropriate change occurs and re-occurs successfully.
  • Sustainment – mechanisms and processes instituted in a repeatable discipline to ensure consistent results and avoid falling back into the legacy traps.

Step 1 – Insight

The most imposing barrier facing Business and IT organizations today is the lack of comprehensive, holistic insight into how the Business executes across the IT Supply Chain.

An effective strategy to create a common dialogue is a systems-thinking exercise: the “day in the life.” How does the business originate, service and fulfill the generation of revenue and the servicing of customers? Dimensions of the dialogue need to incorporate the definition of the value chain of the business (read Michael Porter) and how various business processes move across this chain in a typical daily cycle. Variations such as business calendar events, client behavior demographics, demographic coverage and wall-clock changes must be identified and understood.

As these points of view are gathered, it is essential to capture them along with business driver linkages, key performance indicator identification and any associated perceived deficiencies. Ensure all of this is accurately documented, reviewed and signed off by the business to help prioritize decision-making and support strategy creation.

Let's call this “Know how you make money”
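To make this concrete, the day-in-the-life capture above could be digitized as a simple structured record. This is a minimal sketch, not a prescribed schema: all field and process names (e.g. `BusinessProcess`, "order fulfillment") are hypothetical, and a real capture would carry far more dimensions.

```python
from dataclasses import dataclass


@dataclass
class KPI:
    name: str
    target: str  # e.g. "order-to-ship time < 24h for 95% of orders"


@dataclass
class BusinessProcess:
    name: str
    value_chain_stage: str          # Porter value-chain stage, e.g. "operations"
    business_drivers: list[str]     # linkage to how the process makes money
    kpis: list[KPI]
    perceived_deficiencies: list[str]
    calendar_variations: list[str]  # e.g. "month-end close", "holiday spike"
    signed_off: bool = False        # business review/sign-off status


# Hypothetical example: an order-fulfillment process captured for review
fulfillment = BusinessProcess(
    name="order fulfillment",
    value_chain_stage="operations",
    business_drivers=["revenue recognition", "customer retention"],
    kpis=[KPI("order-to-ship time", "< 24h for 95% of orders")],
    perceived_deficiencies=["manual exception handling"],
    calendar_variations=["holiday-season volume spike"],
)


def ready_for_prioritization(p: BusinessProcess) -> bool:
    """A process record supports decision-making only once it carries
    driver linkages, KPIs, and a business sign-off."""
    return bool(p.business_drivers and p.kpis and p.signed_off)
```

The sign-off flag matters: until the business has reviewed and approved the record, it should not feed prioritization or strategy decisions.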

Now the key is to NOT stop there. Business and IT then need to understand and correlate the runtime execution of the business processes across the IT Supply Chain. This means understanding how transactions, service requests, fulfillment requests, marketing actions, content sharing, online collaboration, and financial and compliance reporting actually occur across the various components and processes of the IT systems in their datacenters.

So there is a parallel day-in-the-life execution model of IT that Business and IT must mutually understand.

Let's call this “Know what fulfills your money making process”

Once the day-in-the-life framework is defined and documented, it becomes critical that firms start to gather objective data on:

  • Usage demographics – which businesses and users are using which systems.
  • Performance demographics – how user experience performs across the systems and data centers.
  • Economic demographics – which users on what systems consume what supply, in what trended manner, against what asset allocation/depreciation/acquisition strategy.
  • System demographics – inventory, composition, inter-dependency, fungibility.
  • IT standards demographics – application pattern standards, infrastructure pattern standards, deployment model standards, and systems management and monitoring practices.

This should be captured in some kind of portfolio management system linking the demand and supply behavioral models gathered through the process defined above.

Let's call this “Know your supply chain components & usage”
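A hedged sketch of what a portfolio entry linking demand to supply might look like follows. The class and field names here are illustrative assumptions, not an actual product's data model; real portfolio management systems track many more dimensions (depreciation, standards compliance, and so on).

```python
from dataclasses import dataclass


@dataclass
class UsageDemographics:
    business_unit: str
    daily_users: int
    peak_tps: float                # peak transactions per second observed


@dataclass
class SupplyComponent:
    asset_id: str
    composition: list[str]         # e.g. ["app server", "db cluster"]
    dependencies: list[str]        # asset_ids this component relies on
    fungible: bool                 # can its capacity be repurposed?


@dataclass
class PortfolioEntry:
    system: str
    demand: UsageDemographics
    supply: list[SupplyComponent]

    def consumers_per_asset(self) -> dict[str, int]:
        """Expose which supply assets carry which usage demand --
        the demand/supply linkage the portfolio exists to capture."""
        return {c.asset_id: self.demand.daily_users for c in self.supply}


# Hypothetical entry: one business system, its users, and its supply chain
entry = PortfolioEntry(
    system="order management",
    demand=UsageDemographics("retail banking", 1200, 85.0),
    supply=[
        SupplyComponent("APP-01", ["app server"], ["DB-01"], True),
        SupplyComponent("DB-01", ["db cluster"], [], False),
    ],
)
```

Even a flat structure like this makes inter-dependency and fungibility queryable, which is the point: objective, trended data rather than tribal knowledge.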

The next part of the day in the life of the IT side of the house is conducting a capability assessment of the people and associated processes that build, run and govern the supply chain. Here it is critical to profile, analyze, assess and score on a competency, maturity and alignment basis. Information gathered and scored should include skills inventory, plan and design processes, build and operate processes, and govern and optimize processes. It is critical to get this current state documented and digitized to support critical decision-making and strategy planning.

Let's call this “Know who & how you operate your supply chain”
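The competency/maturity/alignment scoring described above could be digitized along these lines. This is a minimal sketch under stated assumptions: the 1–5 scale and the 0.4/0.3/0.3 weights are hypothetical and should be calibrated to your own assessment model.

```python
# Capability areas drawn from the text: plan/design, build/operate, govern/optimize
CAPABILITIES = ["plan & design", "build & operate", "govern & optimize"]


def capability_score(competency: int, maturity: int, alignment: int) -> float:
    """Composite score for one capability area.
    Each dimension is scored 1-5; the weights are an illustrative assumption."""
    for v in (competency, maturity, alignment):
        if not 1 <= v <= 5:
            raise ValueError("scores must be 1-5")
    return round(0.4 * competency + 0.3 * maturity + 0.3 * alignment, 2)


# Example current-state assessment, digitized for decision-making
assessment = {
    "plan & design": capability_score(3, 2, 4),
    "build & operate": capability_score(4, 3, 2),
    "govern & optimize": capability_score(2, 2, 2),
}

# The weakest area is a candidate priority for the transformation strategy
weakest = min(assessment, key=assessment.get)
```

Scoring all three areas on the same scale lets the weakest link surface objectively instead of by opinion, which is what "documented and digitized" buys you.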

Summary

To some readers, the step above is a no-brainer. Others may believe it is unnecessary or not an effective investment of time. In either case, proven strategies and successful IT organizations share a very strong discipline around understanding current state. It is the organizations that link business value creation with demand prioritization and profiling, tied to IT capabilities, that will create, implement and operationalize successful transformation strategies for their business and IT operations.

In part 3, we will explore how to apply the insight captured in this step. Decision-making and execution actions there depend on the quality and accuracy of the information gathered during the Insight step described above.

More Stories By Tony Bishop

Blueprint4IT is authored by a longtime IT and Datacenter Technologist. Author of Next Generation Datacenters in Financial Services – Driving Extreme Efficiency and Effective Cost Savings. A former technology executive for both Morgan Stanley and Wachovia Securities.


