Bare Metal Blog: Testing for Numbers or Performance?

What you test can say a lot about you

Along the lines of the first blog in the testing portion of the Bare Metal Blog series, I’d like to talk a bit more about how the testing environment, the device configuration, and the payloads translate into test results.

One of the problems most advanced mass education systems run into is the question of standardized testing. While it is true that you cannot fix what you have not determined is broken, like most things involving people, testing students for specific areas of knowledge practically guarantees that those doing the teaching will err on the side of preparing students to take the test rather than to succeed in life. The mere fact that there IS a test changes what is taught. It is of course possible to make this into a massively positive proposition by targeting the standardized tests at the most important things students need to learn, but for our discussion purposes the result is the same – the students will be taught whatever is on that test first, and all else secondarily.

This is far too often true of vendor product testing also. The mere fact that there will be a test of the equipment, combined with how competitive most high-tech markets are, pushes vendors toward tweaking the device (or the test) to maximize test performance, regardless of what the real-world performance will be.

The current most flagrant problem with testing is a variant on an old theme. Way back when testing the throughput of network switches made sense, there was a lot of “packets per second” testing with no payload. That approach tests the ability of the switch to send packets to the right place, but does not at all test the device in a manner consistent with the real-world usage of switches. Today we have a whole slew of similar tests for ADCs. The purpose of an ADC is to load balance, optimize, and if needed secure the passage of packets – primarily application traffic, because they’re Application Delivery Controllers. Yet application traffic being layer seven means that you need to do some layer seven decision-making if the device is to be tested as it will run in the real world. If the packet is a layer seven packet, but layer four switching is all that is performed on it, the test is completely useless for determining the actual capabilities of the device. And yet there is a lot of that type of testing going on out there right now. It’s time – way past time – to drive testing into the real world for ADCs. Layer seven decision-making is much more complex and requires a deep look at the packets in question, meaning that the results will not be nearly as pretty as simple layer four packet-switching numbers are. And while you cannot do a direct comparison of all of the optional features of two different ADCs, simply because the range of optional functionality is so broad once a solid ADC platform is deployed, you can test the basic capabilities and responsiveness of the core products.
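
To make the layer four / layer seven distinction concrete, here is a minimal Python sketch, with invented backend addresses and routing rules, that contrasts the two decisions: a layer four choice only hashes connection information, while a layer seven choice has to buffer and parse the HTTP request before it can route. A benchmark that never sends real application payloads only ever exercises the first, cheaper path.

```python
import hashlib

BACKENDS = ["10.0.0.1", "10.0.0.2", "10.0.0.3"]  # hypothetical backend pool


def l4_pick(src_ip, src_port, dst_ip, dst_port):
    """Layer four decision: hash the connection 4-tuple; the payload is never read."""
    key = f"{src_ip}:{src_port}->{dst_ip}:{dst_port}".encode()
    digest = int(hashlib.sha256(key).hexdigest(), 16)
    return BACKENDS[digest % len(BACKENDS)]


def l7_pick(raw_request: bytes) -> str:
    """Layer seven decision: buffer and parse the HTTP request before routing.
    This per-request parsing is the work a payload-free benchmark never triggers."""
    head, _, _ = raw_request.partition(b"\r\n\r\n")
    request_line, *header_lines = head.decode("iso-8859-1").split("\r\n")
    _method, path, _version = request_line.split(" ", 2)
    headers = dict(line.split(": ", 1) for line in header_lines if ": " in line)
    if path.startswith("/api/"):
        return "10.0.1.1"  # hypothetical pool for API traffic
    if headers.get("Host", "").startswith("static."):
        return "10.0.2.1"  # hypothetical pool for static content
    return l4_pick("192.0.2.10", 54321, "198.51.100.1", 80)


# Real layer seven traffic forces the parsing path; empty packets never do.
print(l7_pick(b"GET /api/orders HTTP/1.1\r\nHost: shop.example.com\r\n\r\n"))
```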

And that is what we, as an industry, must begin to insist on. I use one single oddity in ADC testing here, but every branch of high-tech testing I’ve been involved in over the years – security, network gear, storage, application – has similar “this is not good enough” testing that we need to demand be dropped in favor of solid testing that reflects a real-world device. Not your real-world device, unless you are running the test lab, but a real-world device that is seeing – and more importantly acting upon – data that the device will encounter in an actual network, doing the job it was designed for.

As I mentioned in the last testing installment, you can make an ADC look astounding if your tests don’t actually force it to do anything. For our public testing, we have standards, and we offer up our configuration and testing goals on DevCentral. Whether you use it to validate the test results F5 publishes or to set up the tests in your own environment, publicly talking about how testing is performed is a big deal. Ask your vendor for the configuration files and testing plan when numbers are tossed at you, and make certain you know what they’re testing when they try to impress you with over-the-top performance numbers. In my career, I have seen cases where “double the performance of our nearest competitor” was used publicly and was as close to an outright lie as possible, since the test and configuration were different between the two products the test claimed to compare.
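
As a hypothetical illustration of what publishing your testing approach can look like, the short Python sketch below (all field names and values are invented) records the full test definition alongside the results, so a reader can check whether two numbers were produced under comparable conditions before believing a comparison.

```python
import json
from dataclasses import asdict, dataclass


@dataclass
class TestPlan:
    """Everything a reader needs to reproduce (or question) a benchmark run."""
    device: str
    firmware: str
    features_enabled: list    # which layer seven features were actually on
    payload_profile: str      # real payloads, not empty packets
    concurrent_connections: int
    duration_seconds: int


plan = TestPlan(
    device="example-adc",     # hypothetical device name
    firmware="1.2.3",
    features_enabled=["l7-routing", "compression"],
    payload_profile="HTTP/1.1 GETs, 20 KB mixed responses",
    concurrent_connections=100_000,
    duration_seconds=600,
)

# Publish the plan file next to the numbers; two results are only
# comparable when their plans match field for field.
with open("test_plan.json", "w") as f:
    json.dump(asdict(plan), f, indent=2)
```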

When you buy any form of datacenter equipment, you’re going to be stuck with it for a good long while. Make certain you know how the testing that is informing your decision was performed, no matter who did it. Independent third-party testing sometimes isn’t so independent, and knowing that can make you more cautious before saddling your company with gear you’ll have to live with.


More Stories By Don MacVittie

Don MacVittie is currently a Senior Solutions Architect at StackIQ, Inc. He is also working with Mesamundi on D20PRO, and is a member of the Stacki Open Source project. He has experience in application development, architecture, infrastructure, technical writing, and IT management. MacVittie holds a B.S. in Computer Science from Northern Michigan University, and an M.S. in Computer Science from Nova Southeastern University.
