Next Generation Datacenters: From Legacy to the Cloud

Part 2 of 5 – The Strategy – Step 1 Insight

This is part 2 of the continuing article series “From Legacy to the Cloud: Next Generation Datacenters”

Restating the obvious: Business and IT have planned, designed and operated business systems in a silo-oriented manner, with disconnected decisions and actions about how the business needs to execute and how systems are created.

In the first article in this series I stated that there is no silver bullet or single strategy that solves everything. There are, however, proven methods and successful strategies that multiple firms have employed to attack the legacy problems in their existing datacenter infrastructures and iterate their way toward next-generation datacenters and IT delivery via cloud computing models.

This article describes a four-step strategy for attacking legacy IT. The steps can be summarized as:

  • Insight – ‘day in the life’ understanding of both business and IT in terms of execution, historical decisions and actions and objective understanding of the actual situation as it exists today.
  • Alignment – creating a common language, taxonomy and system creation model to produce repeatable results.
  • Control – specific actions, tooling, process and approach to ensure the appropriate change occurs and re-occurs successfully.
  • Sustainment – mechanisms and processes instituted in a repeatable discipline to ensure consistent results and avoid falling back into the legacy traps.

Step 1 – Insight

The most imposing barrier facing business and IT organizations today is the lack of comprehensive, holistic insight into how the business executes across the IT supply chain.

An effective way to create a common dialogue is the systems-thinking exercise of walking through a “day in the life” of the business: how does the business originate, service and fulfill the generation of revenue and the servicing of customers? The dialogue should define the value chain of the business (read Michael Porter) and trace how various business processes move across that chain during a typical day. Variations such as business calendar events, client behavior demographics, demographic coverage and wall-clock changes must be identified and understood. As these points of view are gathered, it is essential to capture them along with their business-driver linkages, key performance indicators and any associated perceived deficiencies. Ensure all of this is accurately documented, reviewed and signed off by the business so it can help prioritize decision-making and support strategy creation.

Let's call this “Know how you make money”
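As a rough sketch of what a captured “day in the life” record might look like (the field names, process and KPI values here are illustrative assumptions, not prescribed by any particular tool):

```python
from dataclasses import dataclass, field

@dataclass
class BusinessProcess:
    """One step in the value chain, captured during 'day in the life' interviews."""
    name: str
    value_chain_stage: str                  # e.g. "originate", "service", "fulfill"
    business_drivers: list = field(default_factory=list)
    kpis: dict = field(default_factory=dict)        # KPI name -> target
    perceived_deficiencies: list = field(default_factory=list)
    signed_off: bool = False                # business review/sign-off status

# A hypothetical process observed during the walkthrough
trade_capture = BusinessProcess(
    name="Equity trade capture",
    value_chain_stage="originate",
    business_drivers=["revenue per trade", "client retention"],
    kpis={"capture latency (ms)": 250, "daily volume": 1_200_000},
    perceived_deficiencies=["manual re-keying between front and middle office"],
)

# Only signed-off records should feed prioritization and strategy
ready = [p for p in [trade_capture] if p.signed_off]
print(len(ready))  # 0 until the business signs off
```

The point of the structure is the sign-off gate: nothing feeds strategy until the business has reviewed and approved it.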

Now the key is NOT to stop there. Business and IT then need to understand and correlate the runtime execution of those business processes across the IT supply chain: how do transactions, service requests, fulfillment requests, marketing actions, content sharing, online collaboration, and financial and compliance reporting actually flow across the various components and processes of the IT systems located in their datacenters?

So there is a parallel day in the life execution model of IT that Business and IT must mutually understand.

Let's call this “Know what fulfills your money making process”
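At its simplest, this correlation is a mapping from each business process step to the IT components that fulfill it. A minimal sketch, with hypothetical step and component names:

```python
# Map each business process step to the IT components that fulfill it.
transaction_path = {
    "order entry": ["web tier", "order gateway"],
    "risk check": ["risk engine", "reference data service"],
    "execution": ["matching engine"],
    "settlement": ["settlement batch", "general ledger"],
}

# Which infrastructure components participate in this money-making flow?
components = sorted({c for step in transaction_path.values() for c in step})
print(components)
```

Even this crude inventory answers a question most organizations cannot: which systems, exactly, stand between a customer request and recognized revenue.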

Once the day-in-the-life framework is defined and documented, it becomes critical that firms start gathering objective data on:

  • Usage demographics – which business units and users use which systems.
  • Performance demographics – how user experience performs across the systems and datacenters.
  • Economic demographics – which users on which systems consume what supply, in what trended manner, against what asset allocation/depreciation/acquisition strategy.
  • System demographics – inventory, composition, inter-dependency and fungibility.
  • IT standards demographics – application pattern standards, infrastructure pattern standards, deployment model standards, and systems management and monitoring practices.

This data should be captured in some kind of portfolio management system that links the demand and supply behavioral models captured through the process defined above.

Let's call this “Know your supply chain components & usage”
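A single portfolio entry might link those demand- and supply-side demographics like this (a sketch only; the field names and thresholds are assumed for illustration, not taken from any specific portfolio tool):

```python
# One portfolio entry linking demand-side and supply-side demographics
portfolio_entry = {
    "system": "order-management",
    "usage": {"business_units": ["equities", "fx"], "peak_users": 430},
    "performance": {"p99_latency_ms": 180, "availability_pct": 99.95},
    "economics": {"annual_cost_usd": 1_400_000, "utilization_pct": 22},
    "inventory": {"servers": 64, "fungible": True},
    "standards": {"app_pattern": "low-latency", "deploy_model": "bare-metal"},
}

def overprovisioned(entry, util_threshold=40):
    """Flag over-provisioned supply: low utilization against the spend."""
    return entry["economics"]["utilization_pct"] < util_threshold

print(overprovisioned(portfolio_entry))  # True: 22% utilization is below 40%
```

Once entries like this exist for every system, questions such as “where is supply stranded?” become simple queries instead of committee debates.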

The next part of the day in the life on the IT side of the house is a capability assessment of the people and processes that build the supply chain, run the supply chain and govern the supply chain. Here it is critical to profile, analyze, assess and score each capability on a competency, maturity and alignment basis. Information gathered and scored should include the skills inventory, planning and design processes, build and operate processes, and govern and optimize processes. It is critical to document and digitize this current state to support critical decision-making and strategy planning.

Let's call this “Know who & how you operate your supply chain”
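Scoring on a competency, maturity and alignment basis can be as simple as a weighted average per capability. The weights and 1–5 scores below are illustrative assumptions; each organization would calibrate its own:

```python
# Score one capability on competency, maturity, and alignment (1-5 each),
# weighted to reflect what matters most for transformation planning.
WEIGHTS = {"competency": 0.4, "maturity": 0.35, "alignment": 0.25}

def capability_score(scores: dict) -> float:
    return round(sum(scores[k] * w for k, w in WEIGHTS.items()), 2)

# Hypothetical assessment of the "build the supply chain" capability
build_supply_chain = {"competency": 4, "maturity": 2, "alignment": 3}
print(capability_score(build_supply_chain))  # 3.05
```

The value is less in the arithmetic than in forcing each capability to be scored the same way, so gaps can be compared across teams.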


To some readers, the step above is a no-brainer; others may believe it is unnecessary or not an effective investment of time. Either way, proven strategies and successful IT organizations share a strong discipline around understanding the current state. It is the organizations that link business value creation with demand prioritization and profiling, tied to IT capabilities, that will create, implement and operationalize successful transformation strategies for their business and IT operations.

In part 3, we will explore how to apply the insight captured in this step. The decision-making and execution actions there depend directly on the quality and accuracy of the information gathered during the Insight step described above.

More Stories By Tony Bishop

Blueprint4IT is authored by a longtime IT and Datacenter Technologist. Author of Next Generation Datacenters in Financial Services – Driving Extreme Efficiency and Effective Cost Savings. A former technology executive for both Morgan Stanley and Wachovia Securities.
