Bare Metal Blog: Mean Time Between Failures

MTBF has meaning well beyond storage

If you are new to the Bare Metal Blog series, find them all here

When assembling a model – any model, from a highly detailed functional replica of an engine to a mass-produced plastic model of an airplane – there are several places where things can go wrong. The final product is only as good as the model kit, the glue used, the tools used, and the skill of the craftsman. I’ve seen the exact same model, assembled and painted by two different people, come out looking completely different, simply because of the array of variables and how they interact.

This is true of high-tech equipment also, and like modeling, it is often overlooked. Interestingly, in my entire IT career, MTBF has only been a measure that mattered a great deal in two circumstances: when designing hardware and scoping the parts to go into it, and when talking about storage. In all other endeavors, MTBF, if mentioned at all, was a side note.

And yet it matters. It can matter a lot. Like most hardware companies (because we spec our own parts and monitor our own quality), we track MTBF two ways: calculated from the sum of the parts under average environmental conditions, and actual, tracked from support cases involving hardware and RMAs. For us, knowing helps us improve quality. For customers, knowing helps gauge the bounds of useful life for the equipment being purchased. Of course, MTBF is a mean, not a fact, and it is entirely possible for a device to last much longer than its MTBF; indeed, the fact that it is an average suggests that a good share of the devices out there will last longer. But it is the mean, not the median, so that share is not necessarily half, and most IT shops do not want to plan as if a device will last well beyond its MTBF value. MTBF can offer a bit of guidance when it is fairly calculated, and another tool in the evaluation toolbox never hurt an IT shop.
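A quick illustration of that mean-versus-median point: under the exponential failure model commonly used in reliability math (real hardware only approximates it), a unit has roughly a 37% chance of outliving its MTBF, not 50%. A minimal sketch, using a made-up five-year MTBF rather than any real device's figure:

import math

def survival_probability(hours: float, mtbf_hours: float) -> float:
    """P(a unit is still working after `hours`) under an exponential failure model."""
    return math.exp(-hours / mtbf_hours)

mtbf = 5 * 365 * 24  # hypothetical 5-year MTBF, in hours

# Chance a unit outlives its own MTBF: e^-1, roughly 37% -- not 50%.
print(f"Survives past MTBF: {survival_probability(mtbf, mtbf):.0%}")

# The median life (the 50% point) is MTBF * ln(2), roughly 69% of the MTBF.
print(f"Median life: {mtbf * math.log(2) / (365 * 24):.1f} years")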

As mentioned earlier in this series, F5 sets quality standards that suppliers must meet if they wish to continue supplying. This allows a bit better control over MTBF than “lowest bidder” or similar procurement, simply because the standards set include the quality of parts used, which all rolls into the MTBF calculations – and more importantly for most IT shops, the MTBF reality. While MTBF comes from a complex set of equations, you can generalize to “the MTBF of a device is as low as or lower than the MTBF of its weakest part.” That means supplier quality standards matter in a very real way. I had a RAID array fail on me once – several drives down all at the same time. The array vendor had to count that as a failure, since RAID no longer worked (thank heavens for backups!), but the fault lay with one of their suppliers. That’s how it is in the manufacturing world: whoever’s name is on the box gets the bad rep for quality, regardless of whose handiwork was slipshod. That is why F5’s non-stop quality monitoring program (devices are tested from before release until EOL is announced) matters a lot. It’s also why quality standards for parts suppliers matter more than getting the absolute cheapest part, as some manufacturers are wont to do.
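For the curious, the “weakest part” generalization falls straight out of the standard series-system approximation, in which the failure rates of independent, non-redundant parts simply add. A rough sketch with invented part values, not actual F5 figures:

# Series-system approximation: the device fails when any non-redundant part fails,
# so the failure rates (1 / MTBF) of independent parts add up.
part_mtbf_hours = {            # hypothetical part MTBFs, in hours
    "power_supply": 300_000,
    "fan": 150_000,
    "mainboard": 500_000,
    "storage": 120_000,        # the weakest part
}

system_failure_rate = sum(1 / mtbf for mtbf in part_mtbf_hours.values())
system_mtbf = 1 / system_failure_rate

print(f"System MTBF:  {system_mtbf:,.0f} hours")                 # about 49,000 hours
print(f"Weakest part: {min(part_mtbf_hours.values()):,} hours")  # 120,000 hours
# The whole always comes in at or below its weakest part.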

I will not replicate our entire knowledge base article here; if you have an ask.f5.com account, you can read it there. I’ll just summarize and pull bits out for the readers’ enjoyment.

F5 gear runs the gamut from entry-level appliances to massive blade systems. As such, MTBF varies from device to device. The worst calculated MTBF for an F5 device is over three years, and our quality team tells me that the calculated value is far lower than the real-life value they see from watching returns and the like. The best calculated MTBF is over 21 years. It’s a rare piece of computer gear that is used that long, but Lori and I have some pretty old F5 gear that’s still ticking along like it was new, so no surprises there. Most F5 devices fall somewhere in between.
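If you want to turn figures like those into something more intuitive, a common back-of-the-envelope conversion (again assuming an exponential failure model) is from MTBF to an annualized failure rate – the fraction of a fleet you would expect to fail in a year. The numbers below are illustrative only, not published values:

import math

HOURS_PER_YEAR = 365 * 24

def annualized_failure_rate(mtbf_hours: float) -> float:
    """Fraction of a large fleet expected to fail within one year (exponential model)."""
    return 1 - math.exp(-HOURS_PER_YEAR / mtbf_hours)

for years in (3, 21):  # roughly the low and high ends quoted above
    mtbf_hours = years * HOURS_PER_YEAR
    print(f"MTBF of {years:>2} years -> AFR of about {annualized_failure_rate(mtbf_hours):.0%}")
# MTBF of  3 years -> AFR of about 28%
# MTBF of 21 years -> AFR of about 5%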

Why the large variance in MTBFs if we control for quality? A valid question. The fact is that it is not all about the quality of parts. Airflow inside the device, number of redundant parts, number of removable parts… there are a zillion other things that go into MTBF, and they all tend to get better as the device gets physically larger. Entry-level devices are small, which restricts airflow and cuts down on the space available for redundant power supplies and the like, while the top-end blade systems have room for all of that and, since cards are replaceable, tend to suffer fewer failures. You will find a similar spread with any other vendor that covers such a wide range of hardware. And all of those numbers are likely to beat a COTS server running a software product.
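To put a number on the redundancy point: in the textbook case of two identical, independent power supplies with no repair, the pair only dies when both supplies have died, which stretches the mean time to failure to 1.5 times that of a single supply. This is a generic reliability result with a made-up supply rating, not a claim about any particular device:

# Two identical, independent units in active redundancy (no repair): the pair
# fails only when BOTH units have failed. For exponential lifetimes with rate
# lam, E[max(T1, T2)] = 1/lam + 1/(2*lam) = 1.5 / lam.
single_mtbf_hours = 200_000            # hypothetical single power supply

redundant_pair_mttf = 1.5 * single_mtbf_hours
print(f"Single supply:  {single_mtbf_hours:,} hours")
print(f"Redundant pair: {redundant_pair_mttf:,.0f} hours")
# In practice the gain is much larger still, because a failed supply is usually
# swapped long before its twin has a chance to die.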

So when looking at any electronic gear, ask about MTBF. On its own, it simply gives you insight into the priorities behind the device you’re looking at; combined with the MTBF numbers for several different devices (from the same manufacturer or several), it gives you an idea of what you are buying in terms of quality. Of course, with a large chunk of any given appliance handled in software, MTBF is not as meaningful as it once was, but it is still the underlying bedrock for that software to run on.

More Stories By Don MacVittie

Don MacVittie is Founder of Ingrained Technology, LLC, specializing in development, DevOps, and cloud strategy. Previously, he was a Technical Marketing Manager at F5 Networks. An industry veteran, MacVittie has extensive programming experience along with project management, IT management, and systems/network administration expertise.

Prior to joining F5, MacVittie was a Senior Technology Editor at Network Computing, where he conducted product research and evaluated storage and server systems, as well as development and outsourcing solutions. He has authored numerous articles on a variety of topics aimed at IT professionals. MacVittie holds a B.S. in Computer Science from Northern Michigan University, and an M.S. in Computer Science from Nova Southeastern University.
