From Legacy to the Cloud: Next Generation Datacenters

Part 1 of 5 – The problem defined

This is the introduction to a five-part series focused on identifying the problem and sharing proven strategies for transforming legacy datacenters and infrastructure into next generation datacenters that operate in a cloud computing model for enterprise IT.

The article builds on concepts I introduced in my June 2009 book, Next Generation Datacenters in Financial Services: Driving Extreme Efficiency and Effective Cost Savings (Elsevier). It continues thinking I have shared in the past, shaped by recent dialogues with technology executives about the problems they continue to face.


The Problem

One of the fundamental barriers to business today is the disconnect between the "day in the life" of the business and how that day executes across the IT supply chain. The barriers caused by this Business-IT chasm include limits on customer interaction, ineffective and inconsistent decision-making, slow reaction to market changes, and the cost of doing business.

This chasm typically arises across organizations because of disconnected decisions and actions made by both the business and IT.

The business can be faulted for not providing insight into how it operates day to day on an end-to-end process and execution basis. Instead, each business function drives its application and IT teams to build systems focused only on that function's area of responsibility, creating an incomplete view of business execution and denying IT the context needed for optimal system development.

IT organizations take this siloed input from the business and compound it across applications, data repositories and infrastructure systems. A conflict occurs both top down and bottom up: application and data teams build their own vertically oriented capabilities, while infrastructure teams define and provision capacity based on their own datacenter strategy of compute, network, storage, placement, power, cooling and physical characteristics. This amplifies the silo disconnect, resulting in waste, quality-of-delivery barriers, cost limitations and operational risk.

Multiply this scenario across the various business functions, processes, channels, products and services, and the number of disconnected inputs grows rapidly. Feed those inputs into the business-to-IT "factory" that converts requirements into systems, and the output is business-limiting systems produced by an ineffective and poorly aligned system creation process.

Additional factors compound the problem. Legacy technology brings limitations in manageability, integration, efficiency and cost. Most organizations have not implemented the discipline to document and maintain an accurate understanding of their systems and how they execute on a daily basis. Then there is the "pain avoidance" strategy, in which firms push off upgrades and enhancements, leaving unsupported systems that create even greater risk and limit the change strategies a firm may wish to implement.

There is no silver bullet or single strategy that solves it all. There are, however, proven methods and successful strategies that multiple firms have employed to attack the legacy problems in their existing datacenter infrastructure and iterate their way toward next generation datacenters and cloud-style IT delivery.

The Strategy

In Parts 2 through 5 of this series, I will describe a four-step strategy for attacking legacy IT. The steps can be summarized as:

  • Insight – a day-in-the-life understanding of both business and IT in terms of execution, historical decisions and actions, and an objective view of the actual situation as it exists today (a minimal sketch of the kind of record this step might produce follows this list).
  • Alignment – creating a common language, taxonomy and system creation model that produces repeatable results.
  • Control – the specific actions, tooling, processes and approach that ensure the appropriate change occurs, and re-occurs, successfully.
  • Sustainment – mechanisms and processes instituted as a repeatable discipline to ensure consistent results and avoid falling back into the legacy traps.
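
As a purely illustrative sketch of the kind of record the Insight step might produce, the Python snippet below models a day-in-the-life mapping from a business process to the IT components that execute it, and flags documentation and support gaps. All of the names and fields (BusinessProcess, ITComponent, insight_gaps, and so on) are assumptions made for this example; they are not taken from the book or prescribed by the methodology described in this series.

```python
# Illustrative only: a minimal "day in the life" inventory mapping a business
# process to the IT components that execute it. Names and fields are
# hypothetical assumptions, not part of the methodology described here.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class ITComponent:
    name: str
    layer: str                     # e.g. "application", "data", "infrastructure"
    owner: Optional[str] = None    # accountable team, if documented
    supported: bool = True         # False = support has lapsed ("pain avoidance")


@dataclass
class BusinessProcess:
    name: str
    steps: List[str] = field(default_factory=list)            # end-to-end execution steps
    components: List[ITComponent] = field(default_factory=list)

    def insight_gaps(self) -> List[str]:
        """Surface the objective gaps that the Insight step should make visible."""
        gaps = []
        for c in self.components:
            if c.owner is None:
                gaps.append(f"{c.name}: no documented owner")
            if not c.supported:
                gaps.append(f"{c.name}: unsupported system, elevated operational risk")
        return gaps


if __name__ == "__main__":
    trade_capture = BusinessProcess(
        name="Trade capture",
        steps=["client order", "validation", "booking", "settlement feed"],
        components=[
            ITComponent("order-entry app", "application", owner="Front Office IT"),
            ITComponent("positions database", "data"),
            ITComponent("legacy settlement host", "infrastructure", owner="Ops", supported=False),
        ],
    )
    for gap in trade_capture.insight_gaps():
        print(gap)
```

The point is not the particular fields but that Insight yields an objective, end-to-end record of business execution that the Alignment and Control steps can then act on.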

About the Author

Tony Bishop writes the Blueprint4IT blog and is a longtime IT and datacenter technologist. He is the author of Next Generation Datacenters in Financial Services: Driving Extreme Efficiency and Effective Cost Savings (Elsevier) and a former technology executive at both Morgan Stanley and Wachovia Securities.
