@CloudExpo: Blog Post

IT Change Management: The Foundation to the Cloud

Getting a better handle on inventory management and resource relationships

IT shops continually struggle to keep resource documentation up to date. Too often, IT departments seem resigned to accepting mediocre results, as if the world conspires against them. Resource tracking is a prime example. Fundamentally, the various parts of the organization refuse to comply with the manual processes required to keep a configuration management database (CMDB) current, and this is the number one barrier to adopting any form of cloud-like delivery model.

Furthermore, advancements in virtualization have made the situation worse by severing software resources from physical ones. Traditional resource-tracking methods are further confounded as virtual resources change their physical location over time.

This reinforces the protest that an application's dependency profile is too complicated to capture with 100 percent accuracy. Over time, IT has simply accepted that it is impractical to keep the physical architecture records of systems up to date manually. Instead of maintaining the documentation, teams regenerate it when needed through a time-consuming trial-and-error process, and by the time it's done, it's hopelessly out of date.

Not only is this an inefficient use of people's time, it's no way to manage some of the most important resources in an organization. These IT resources have become a digital backbone upon which most organizations are totally dependent for their very survival. The cavalier attitude toward managing IT resource inventory is dangerous at best and can prove to be catastrophic.

For instance, it's just a matter of time until the next large outage occurs. Despite the best-laid disaster recovery plans, technology has an uncanny ability to fail in unexpected ways; the combinatorial possibilities are immense. Having accurate application maps can make the difference between being the hero and the goat. If individual self-preservation isn't motivating enough, there are other practical organizational reasons for keeping this information available: datacenter moves, chargeback models, and consumption analysis to name a few.

Generally this lack of inventory control has been accepted as the status quo despite the clear risks. In no other industry would such poor controls be acceptable. Imagine Wal-Mart's CEO explaining a lack of understanding of store inventory to Wall Street analysts. If such a story broke, he'd be out of a job before the ink dried. IT management shouldn't be allowed to get away with it either.

This is exceedingly frustrating, as application dependency mapping tools have become mature offerings. An application dependency mapping tool uncovers the otherwise hidden relationships between applications and infrastructure resources. In our experience as former IT practitioners, we have implemented such capabilities using tools from CA, BMC, and IBM.
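To make the idea concrete, here is a minimal sketch of what a dependency map is underneath: a graph of "who talks to whom," built from observed connections. The host names and connection data are illustrative assumptions, not output from any vendor's tool; real products discover these edges via agents or network scans.

```python
from collections import defaultdict

# Hypothetical observations a discovery scan might produce.
observed_connections = [
    ("order-web-01", "order-db-01"),    # web tier talks to its database
    ("order-web-01", "auth-svc-01"),    # ...and to an auth service
    ("report-batch-01", "order-db-01"), # a batch job shares the database
]

def build_dependency_map(connections):
    """Return {server: set of servers it depends on}."""
    deps = defaultdict(set)
    for source, target in connections:
        deps[source].add(target)
    return deps

def impacted_by(deps, failed):
    """Servers whose dependencies include the failed resource."""
    return sorted(s for s, targets in deps.items() if failed in targets)

deps = build_dependency_map(observed_connections)
print(impacted_by(deps, "order-db-01"))  # ['order-web-01', 'report-batch-01']
```

Even this toy version shows why such a map matters during an outage: given a failed resource, the impact set falls out of a simple graph query instead of a trial-and-error hunt.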

Our main goal was to get a better handle on inventory management and resource relationships for a datacenter move, but we quickly realized that there were many additional benefits we could take advantage of:

  • Identification of idle servers for reclamation (we were shocked at how many)
  • Identification of legacy infrastructure that poses a security threat (such as Windows 95/98)
  • Identification of production servers using development or test resources
  • Identification of changes to an environment over time, especially useful for figuring out "what changed" for debugging purposes
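The first three checks above reduce to simple queries over inventory records once discovery data exists. The sketch below illustrates this under assumed field names and thresholds; the record schema is invented for illustration and does not match any particular vendor's export format.

```python
# Hypothetical inventory records as a discovery tool might export them.
inventory = [
    {"host": "app-01", "os": "Windows Server 2008", "env": "prod",
     "avg_cpu_pct": 42.0, "connections_30d": 1800},
    {"host": "app-02", "os": "Windows 98", "env": "prod",
     "avg_cpu_pct": 1.2, "connections_30d": 0},
    {"host": "test-db", "os": "Linux", "env": "test",
     "avg_cpu_pct": 9.0, "connections_30d": 350},
]

LEGACY_OS = ("Windows 95", "Windows 98")

def idle_servers(records, cpu_threshold=2.0):
    """Candidates for reclamation: near-zero load and no recent traffic."""
    return [r["host"] for r in records
            if r["avg_cpu_pct"] < cpu_threshold and r["connections_30d"] == 0]

def legacy_servers(records):
    """Hosts running operating systems that pose a security threat."""
    return [r["host"] for r in records if r["os"].startswith(LEGACY_OS)]

print(idle_servers(inventory))    # ['app-02']
print(legacy_servers(inventory))  # ['app-02']
```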

In addition, these tools provide a repository that supports a cradle-to-grave resource management policy.

We also found it helpful that these tools log every change to a server over time, not just the results of the last scanning sweep, and they retain that history even when a resource is unavailable for an extended period.
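The change-logging behavior described above amounts to keeping every scan's snapshot per server and diffing two points in time to answer "what changed?". This is a hypothetical sketch of that idea; the attribute names and scan dates are invented for illustration.

```python
history = {}  # host -> list of (scan_time, attributes) snapshots

def record_scan(host, scan_time, attributes):
    """Append a scan snapshot instead of overwriting the last one."""
    history.setdefault(host, []).append((scan_time, dict(attributes)))

def what_changed(host, old_time, new_time):
    """Return {attribute: (old_value, new_value)} between two scans."""
    snapshots = dict(history[host])
    old, new = snapshots[old_time], snapshots[new_time]
    return {k: (old.get(k), new.get(k))
            for k in set(old) | set(new) if old.get(k) != new.get(k)}

record_scan("app-01", "2011-01-01", {"os_patch": "SP1", "ram_gb": 8})
record_scan("app-01", "2011-02-01", {"os_patch": "SP2", "ram_gb": 16})
print(what_changed("app-01", "2011-01-01", "2011-02-01"))
```

Because snapshots are appended rather than overwritten, the "what changed" question for debugging becomes a lookup and a diff, even for a server that was offline between scans.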

Bottom line: to institute change that exploits cloud-like delivery models in the enterprise, firms must get a firm grip on their IT change management.

More Stories By Tony Bishop

Blueprint4IT is authored by a longtime IT and Datacenter Technologist. Author of Next Generation Datacenters in Financial Services – Driving Extreme Efficiency and Effective Cost Savings. A former technology executive for both Morgan Stanley and Wachovia Securities.


