IT Change Management: The Foundation to the Cloud

Getting a better handle on inventory management and resource relationships

IT shops continually struggle to keep resource documentation up to date. Too often IT departments seem resigned to mediocre results, as if the world conspires against them. Resource tracking is a prime example: across the organization, teams simply don't follow the manual processes needed to keep a configuration management database (CMDB) current. That failure is the number one barrier to moving to any form of cloud-like delivery model.

Furthermore, advancements in virtualization have made the situation worse, as software resources have been severed from physical resources. Traditional resource-tracking methods break down further as virtual resources change their physical location over time.

This reinforces the familiar protest that an application's dependency profile is too complicated to document with 100 percent accuracy. Over time, IT has simply accepted that it is impractical to keep systems' physical architecture records up to date manually. Instead of maintaining the documentation, teams regenerate it when needed through a time-consuming trial-and-error process, and by the time it's done, it's hopelessly out of date.

Not only is this an inefficient use of people's time, it's no way to manage some of the most important resources in an organization. IT resources have become a digital backbone on which most organizations depend for their very survival. A cavalier attitude toward managing IT resource inventory is dangerous at best and catastrophic at worst.

For instance, it's just a matter of time until the next large outage occurs. Despite the best-laid disaster recovery plans, technology has an uncanny ability to fail in unexpected ways; the combinatorial possibilities are immense. Having accurate application maps can make the difference between being the hero and the goat. If individual self-preservation isn't motivating enough, there are other practical organizational reasons for keeping this information available: datacenter moves, chargeback models, and consumption analysis to name a few.

Generally this lack of inventory control has been accepted as the status quo despite the clear risks. In no other industry would such poor controls be acceptable. Imagine Wal-Mart's CEO explaining a lack of understanding of store inventory to Wall Street analysts. If such a story broke, he'd be out of a job before the ink dried. IT management shouldn't be allowed to get away with it either.

This is exceedingly frustrating, as application dependency mapping tools have become mature offerings. An application dependency mapping tool uncovers the otherwise hidden relationships between applications and infrastructure resources. In our experience as former IT practitioners, we have implemented such capabilities using tools from CA, BMC and IBM.
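
To make that concrete, here is a minimal sketch (in Python, with invented host names and record formats, not any vendor's API) of the core idea these tools share: observe which hosts talk to which, then roll those observations up into a dependency map. Real products gather the raw data from network traffic, agents, or management interfaces.

    # Hypothetical sketch of turning observed network connections into an
    # application-to-infrastructure dependency map. Host names, ports and the
    # record format are illustrative only.
    from collections import defaultdict

    # Each observation: (source_host, destination_host, destination_port)
    observations = [
        ("web01", "app01", 8080),
        ("web02", "app01", 8080),
        ("app01", "db01", 1521),
        ("app01", "mq01", 5672),
    ]

    def build_dependency_map(observations):
        """Group observed connections into a host -> downstream dependencies map."""
        deps = defaultdict(set)
        for src, dst, port in observations:
            deps[src].add((dst, port))
        return deps

    if __name__ == "__main__":
        for host, downstream in sorted(build_dependency_map(observations).items()):
            targets = ", ".join(f"{d}:{p}" for d, p in sorted(downstream))
            print(f"{host} depends on {targets}")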

Our main goal was to get a better handle on inventory management and resource relationships for a datacenter move, but we quickly realized that there were many additional benefits we could take advantage of (a couple of which are sketched in code after the list):

  • Identification of idle servers for reclamation (we were shocked at how many)
  • Identification of legacy infrastructure that poses a security threat (such as Windows 95/98)
  • Identification of production servers using development or test resources
  • Identification of changes to an environment over time, especially useful for figuring out "what changed" for debugging purposes
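
As a rough illustration of the kind of queries such a repository makes trivial, the sketch below checks a toy inventory for two of the items above, idle servers and legacy operating systems. The field names and the idle threshold are assumptions for illustration, not any product's schema.

    # Illustrative only: a toy inventory query for two of the checks above.
    servers = [
        {"name": "fin-web01", "os": "Windows Server 2008", "cpu_util_30d": 42.0},
        {"name": "hr-app02", "os": "Windows 98", "cpu_util_30d": 0.3},
        {"name": "legacy-db", "os": "Windows 95", "cpu_util_30d": 1.1},
    ]

    LEGACY_OS = ("Windows 95", "Windows 98")
    IDLE_THRESHOLD = 1.0  # percent average CPU over 30 days; arbitrary cutoff

    idle = [s["name"] for s in servers if s["cpu_util_30d"] < IDLE_THRESHOLD]
    legacy = [s["name"] for s in servers if s["os"] in LEGACY_OS]

    print("Candidates for reclamation:", idle)
    print("Security review needed:", legacy)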

In addition, these tools provide a repository that supports a cradle-to-grave resource management policy.

What we found especially helpful was that these tools log every change that occurs to a server over time, not just the last scanning sweep, even if a resource is unavailable for an extended period.
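
A minimal sketch of why that matters, assuming a simple per-scan snapshot structure rather than any vendor's schema: with every scan retained, "what changed" becomes a straightforward diff between two snapshots.

    # Keep every scan, not just the latest, so "what changed" between two
    # points in time is a direct comparison. Snapshot fields are hypothetical.
    from datetime import date

    history = {
        date(2011, 3, 1): {"os_patch": "SP1", "installed": {"java6", "tomcat6"}},
        date(2011, 4, 1): {"os_patch": "SP2", "installed": {"java6", "tomcat7"}},
    }

    def diff_scans(older, newer):
        """Report attribute and software changes between two scan snapshots."""
        changes = []
        if older["os_patch"] != newer["os_patch"]:
            changes.append(f"os_patch: {older['os_patch']} -> {newer['os_patch']}")
        for added in newer["installed"] - older["installed"]:
            changes.append(f"installed: +{added}")
        for removed in older["installed"] - newer["installed"]:
            changes.append(f"installed: -{removed}")
        return changes

    print(diff_scans(history[date(2011, 3, 1)], history[date(2011, 4, 1)]))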

Bottom line: to institute change that exploits cloud-like delivery models in the enterprise, firms must get a firm grip on their IT change management.

More Stories By Tony Bishop

Blueprint4IT is authored by a longtime IT and Datacenter Technologist. Author of Next Generation Datacenters in Financial Services – Driving Extreme Efficiency and Effective Cost Savings. A former technology executive for both Morgan Stanley and Wachovia Securities.
