
It’s All About the Data: #MachineLearning | @CloudExpo #IoT #ML #BigData

The goal of machine learning sounds simple: provide systems with the ability to learn based on the information provided them

Big Data. Analytics. Internet of Things. Cloud. In the last few years, you cannot have a discussion about technology without those terms entering the conversation. They have been major technology disruptors, impacting all aspects of the business. Change occurs at breakneck speed and shows no sign of slowing; today, the one constant in technology is change. Constant change requires constant innovation, which in turn introduces new technologies. One of the newer entrants to the conversation is machine learning. Gartner identified machine learning as one of the top 10 technology trends for 2016. It is definitely a hot topic.

Everything old is new again
What I find fascinating about machine learning is that its basic tenets hark back to the '70s and '80s, the early years of artificial intelligence research. The work at that time was constrained by compute capacity and the amount of data available. The key to machine learning's leap forward in recent years is that both of those constraints no longer hold. Compute cycles and data are available at levels unimagined just decades ago.

The goal of machine learning sounds simple: provide systems with the ability to learn based on the information provided to them. Simple as it sounds, this runs counter to classic software engineering and has its challenges. Most software development we are familiar with ‘hard codes' the system's behavior based on planned and anticipated user and data interactions - the standard ‘if-then-else' model.
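To make the contrast concrete, here is a minimal, hypothetical sketch in Python. The first function is the classic ‘if-then-else' model: its behavior is fixed at design time. The second is a toy learner whose behavior comes entirely from the examples it is given. All names and the word-counting approach are illustrative assumptions, not anything from a production system.

```python
# Hypothetical contrast: hard-coded rules vs. behavior learned from data.

def rule_based_spam_check(subject: str) -> bool:
    """Classic 'if-then-else' logic: behavior is fixed at design time."""
    if "free money" in subject.lower():
        return True
    elif subject.isupper():
        return True
    else:
        return False

class LearnedSpamCheck:
    """Toy learner: flags messages whose words appear more often in spam."""
    def __init__(self):
        self.spam_counts = {}
        self.ham_counts = {}

    def train(self, subject: str, is_spam: bool):
        counts = self.spam_counts if is_spam else self.ham_counts
        for word in subject.lower().split():
            counts[word] = counts.get(word, 0) + 1

    def predict(self, subject: str) -> bool:
        words = subject.lower().split()
        spam = sum(self.spam_counts.get(w, 0) for w in words)
        ham = sum(self.ham_counts.get(w, 0) for w in words)
        return spam > ham

model = LearnedSpamCheck()
model.train("win free money now", is_spam=True)
model.train("meeting notes attached", is_spam=False)
print(model.predict("free money inside"))  # decided by the data it saw
```

Notice that nobody wrote a rule saying "free money" is suspicious in the second version; the behavior emerged from the training examples, and would change if different examples were provided.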

The algorithms required for artificial intelligence/machine learning are much more complex. They need to allow the system to develop its own analytical models based on inputs, and those models constantly change as new information arrives. Behavior is determined by the data and those models. As you can tell from the description, this results in very non-deterministic behavior. The system will analyze, interpret, and react based on the information provided, then modify that behavior as more information and feedback arrive. The analysis and behavior are constantly changing and being refined over time. Imagine developing the test suite for that system! (A topic for future discussion.)
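The feedback loop described above can be sketched in a few lines. This is a deliberately tiny, assumed example (a single-weight online estimator with a made-up learning rate), not any particular production algorithm; the point is only that every observation nudges the model, so its behavior depends on the data it has seen so far.

```python
# Minimal sketch of the feedback loop: the model's behavior is not
# fixed at design time; it shifts with every observation it receives.
# The class name and learning rate are illustrative assumptions.

class OnlineEstimator:
    """Predicts y from x with a single weight, refined by feedback."""
    def __init__(self, lr: float = 0.1):
        self.weight = 0.0
        self.lr = lr

    def predict(self, x: float) -> float:
        return self.weight * x

    def feedback(self, x: float, y_true: float):
        # Gradient step on squared error: the model reshapes itself.
        error = self.predict(x) - y_true
        self.weight -= self.lr * error * x

model = OnlineEstimator()
for x, y in [(1, 2), (2, 4), (3, 6)]:   # a stream of observations
    model.feedback(x, y)                 # behavior changes after each one
print(round(model.weight, 2))            # approaches 2.0 as feedback accumulates
```

Run the same loop with the observations in a different order, or with different data, and the resulting model differs - exactly the non-determinism that makes testing such systems an interesting challenge.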

You are already reaping the benefits of machine learning
Do you have a Netflix account? Or Amazon? Both Netflix and Amazon provide a ‘recommended for you' list every time you log in. Both companies have very complex, proprietary algorithms analyzing the huge repository of information about you and all their members' transactions. Based on that information, they develop models of your expected behavior and present a list of recommendations to you. How you react to those recommendations is also fed back into the algorithms, constantly tweaking and adjusting your behavior model.
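A drastically simplified, hypothetical version of that feedback loop might look like the following. The real Netflix and Amazon systems are, as noted, complex and proprietary; the category names and score adjustments here are invented purely to show how reactions feed back into the recommendation model.

```python
# Hypothetical, highly simplified recommendation feedback loop.
# Item names and the score multipliers are illustrative assumptions.

class Recommender:
    def __init__(self, items):
        self.scores = {item: 1.0 for item in items}

    def recommend(self, n: int = 2):
        """Return the top-n items by current score."""
        ranked = sorted(self.scores, key=self.scores.get, reverse=True)
        return ranked[:n]

    def feedback(self, item: str, clicked: bool):
        # Each reaction tweaks the behavior model, as described above.
        self.scores[item] *= 1.5 if clicked else 0.8

r = Recommender(["drama", "sci-fi", "comedy"])
r.feedback("sci-fi", clicked=True)    # you watched the sci-fi suggestion
r.feedback("drama", clicked=False)    # you ignored the drama suggestion
print(r.recommend())                  # ['sci-fi', 'comedy']
```

Even this toy shows the essential property: the list you see tomorrow is shaped by what you did with the list you saw today.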

Or how about your smartphone? Think for a moment about the complexity of the simple statement, "Siri, what is the weather forecast for today?" First, the software needs to understand your voice, your accent, and your manner of speaking in order to determine the actual words being spoken. If it's not sure, the software asks for clarification, and it learns from that clarification. Each time you use it, your phone gets better at understanding what you are saying. Once it understands the words, it has to process natural language into something meaningful to the system. This again requires complex algorithms analyzing the information, creating a model, and executing on its interpretation. As with parsing the words, if it's not sure, the software will prompt for clarification, and that clarification will be fed back into the system that models your way of speaking and the context of the language you use.

It's all about the data
In a recent TechCrunch article, ‘How startups can compete with enterprises in artificial intelligence and machine learning,' John Melas-Kyriazi refers to data as the ‘fuel we feed into training machine learning models that can create powerful network effects at scale.' I find that a very apt analogy. The complex algorithms and models are the engine of machine learning, but without fuel - the data - the engine won't work very well, if at all. A colleague of mine, John Williams (Chief Strategy Officer at Collaborative Consulting), has for years been fond of saying, "It's all about the data." That could not be more true than in the world of machine learning.

Given the importance of the data to the success of any machine learning implementation, there are some key considerations to take into account:

  • Data Quality - In the world of data, this has always been an important consideration. Data cleansing and scrubbing are already standard practice in many organizations, and they have become critical for machine learning implementations. Putting dirty fuel into even the best engine will bring it to a grinding halt.
  • Data Volume - Big Data is tailor-made for machine learning. The more information the algorithms and subsequent models have to work with, the better the results. The key word here is learning. We as individuals learn more as more information is provided to us. This concept is directly applicable in the machine learning world.
  • Data Timeliness - Besides volume, new and timely data is also a consideration. If the machine learning is based on a large volume of data that is completely outdated, the resulting models will not be very useful.
  • Data Pedigree - Where did the data come from? Is it a valid source? Pedigree is less of a concern when using internal systems, where the source is well known, but many machine learning systems will be getting their data from public sources - or, potentially, from the many devices in the world of the Internet of Things. Crowd-sourced data (for example, from Waze, a GPS mobile app) requires extra effort to ensure you trust the information being consumed. Imagine a new kind of cyber-attack: feeding your machine learning system bad data to skew its results. Remember Microsoft's problem with its AI chatbot Tay learning to be a racist?
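The considerations above can be turned into a simple gate that records must pass before they reach a training set - don't put dirty fuel in the engine. This is an illustrative sketch only: the field names, the one-year freshness window, and the trusted-source list are all assumptions made up for the example.

```python
# Illustrative data-quality gate covering quality, timeliness, and
# pedigree checks. Field names and thresholds are assumptions.

from datetime import datetime, timedelta

TRUSTED_SOURCES = {"internal_crm", "verified_partner"}
MAX_AGE = timedelta(days=365)

def is_trainable(record: dict, now: datetime) -> bool:
    # Quality: required fields must be present and non-empty.
    if not record.get("value"):
        return False
    # Timeliness: stale data yields stale models.
    if now - record["timestamp"] > MAX_AGE:
        return False
    # Pedigree: only consume data from sources we trust.
    if record.get("source") not in TRUSTED_SOURCES:
        return False
    return True

now = datetime(2017, 1, 1)
records = [
    {"value": 42, "timestamp": datetime(2016, 6, 1), "source": "internal_crm"},
    {"value": None, "timestamp": datetime(2016, 6, 1), "source": "internal_crm"},
    {"value": 7, "timestamp": datetime(2014, 1, 1), "source": "internal_crm"},
    {"value": 9, "timestamp": datetime(2016, 6, 1), "source": "unknown_feed"},
]
clean = [r for r in records if is_trainable(r, now)]
print(len(clean))  # only the first record passes every check
```

A gate like this won't catch a determined bad-data attack on its own, but making the checks explicit is a first line of defense for quality, timeliness, and pedigree alike.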

No technology negates the need for good design and planning
There is no doubt machine learning has enormous potential to impact businesses across the spectrum, from diagnosing Alzheimer's disease in healthcare to self-driving cars once found only in science fiction. But no technology negates the need for good design and planning, and machine learning is no different. As technologists, it's our responsibility to ensure the proper efforts have been made to supply machine learning implementations with the best fuel possible. Understanding the quality, volume, timeliness, and pedigree needs of these systems can help us navigate this new world of machine learning, leading us to successful execution and, ultimately, providing value back to the business.

More Stories By Ed Featherston

Ed Featherston is VP, Principal Architect at Cloud Technology Partners. He brings 35 years of technology experience in designing, building, and implementing large complex solutions. He has significant expertise in systems integration, Internet/intranet, and cloud technologies. He has delivered projects in various industries, including financial services, pharmacy, government and retail.

