It’s All About the Data: #MachineLearning | @CloudExpo #IoT #ML #BigData

The goal of machine learning sounds simple: provide systems with the ability to learn based on the information provided to them

Big Data. Analytics. Internet of Things. Cloud. In the last few years, it has become impossible to have a discussion around technology without those terms entering the conversation. They have been major technology disruptors impacting all aspects of the business. Change seems to occur at breakneck speed and shows no sign of slowing. Today, it appears the one constant in technology is change. Constant change requires constant innovation, which in turn introduces more new technologies. One of the new technologies entering the conversation is machine learning. Gartner identified machine learning as one of the top 10 technology trends for 2016. It is definitely a hot topic.

Everything old is new again
What I find fascinating about machine learning is that its basic tenets harken back to the '70s and '80s, the early years of artificial intelligence research. The work at that time was constrained by compute capacity and the amount of data available. The key to machine learning's leap forward in recent years is that both of those constraints no longer hold: compute cycles and data are available at levels unimagined just decades ago.

The goal of machine learning sounds simple: provide systems with the ability to learn based on the information provided to them. Simple as it sounds, this runs counter to classic software engineering and has its challenges. Most software development we are familiar with ‘hard codes' the system's behavior based on planned and anticipated user and data interactions: the standard ‘if-then-else' model.
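
To make the contrast concrete, here is a minimal Python sketch of that classic, hard-coded style. The login-risk scenario and its thresholds are hypothetical; the point is simply that every branch and cutoff is decided by the developer up front and never changes on its own.

    # Hypothetical hard-coded rule: every threshold and branch is fixed by the developer.
    def is_suspicious_login(failed_attempts: int, new_device: bool) -> bool:
        if failed_attempts > 3:
            return True
        elif new_device and failed_attempts > 1:
            return True
        else:
            return False

    print(is_suspicious_login(failed_attempts=4, new_device=False))  # True, and always will be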

The algorithms required for artificial intelligence/machine learning are much more complex. They need to allow the system to develop its own analytical models based on inputs. Those models are constantly changing based on the information provided, and behavior is determined by the data and those models. As you can tell from the description, this results in very non-deterministic behavior. The system will analyze, interpret, and react based on the information provided, then modify that behavior as more information and feedback are provided. The analysis and behavior are constantly changing and being refined over time. Imagine developing the test suite for that system! (A topic for future discussion.)
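
By way of contrast with the hard-coded rule above, here is a minimal online-learning sketch in Python. It is purely illustrative (a perceptron-style update, not any particular product's algorithm): the "model" is just a weight vector that is nudged after every labeled example, so its behavior depends on the data it has seen, and even on the order in which it arrived.

    import random

    # A tiny model whose behavior is learned from a stream of examples.
    weights = [0.0, 0.0]   # one weight per feature
    bias = 0.0
    learning_rate = 0.1

    def predict(features):
        score = bias + sum(w * x for w, x in zip(weights, features))
        return 1 if score > 0 else 0

    def update(features, label):
        """Perceptron-style update: nudge the model whenever it gets an example wrong."""
        global bias
        error = label - predict(features)
        if error != 0:
            for i, x in enumerate(features):
                weights[i] += learning_rate * error * x
            bias += learning_rate * error

    # Feed a stream of (features, label) examples; the model keeps changing as data arrives,
    # and shuffling the stream shows how the final behavior depends on the order of the data.
    stream = [([4.0, 1.0], 1), ([0.5, 0.2], 0), ([3.5, 0.8], 1), ([0.3, 0.1], 0)]
    random.shuffle(stream)
    for features, label in stream:
        update(features, label)

    print(weights, bias, predict([3.0, 1.0]))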

You are already reaping the benefits of machine learning
Do you have a Netflix account? Or Amazon? Both Netflix and Amazon provide a ‘recommended for you' list every time you log in. Both companies have very complex, proprietary algorithms analyzing the huge repository of information about you and all their members' transactions. Based on that information, they develop models of your expected behavior and present a list of recommendations to you. How you react to those recommendations is also fed back into the algorithms, constantly tweaking and adjusting your behavior model.
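
The details of those systems are proprietary, but the feedback loop itself can be sketched in a few lines of Python. Everything here (the genre-based preference model, the scoring weights, the catalog) is made up for illustration; it simply shows how a reaction to one recommendation changes the next ranking.

    from collections import defaultdict

    preferences = defaultdict(float)   # toy model of one user's tastes, keyed by genre

    def recommend(catalog):
        """Rank items using the current preference model."""
        return sorted(catalog, key=lambda item: preferences[item["genre"]], reverse=True)

    def record_feedback(item, watched):
        """Feed the user's reaction back into the model, nudging future rankings."""
        preferences[item["genre"]] += 1.0 if watched else -0.5

    catalog = [
        {"title": "Space Saga", "genre": "sci-fi"},
        {"title": "Baking Duel", "genre": "reality"},
        {"title": "Nebula Rising", "genre": "sci-fi"},
    ]

    record_feedback(catalog[0], watched=True)                # the user watched a sci-fi title
    print([item["title"] for item in recommend(catalog)])    # sci-fi now ranks first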

Or how about your smartphone? Think for a moment about the complexity of the simple statement, "Siri, what is the weather forecast for today?" First, the software needs to understand your voice, your accent, and your manner of speaking in order to determine the actual words being spoken. If it's not sure, the software asks for clarification, and it learns from the clarification. Each time you use it, your phone gets better at understanding what you are saying. Once it has the words, it has to turn that natural language into something meaningful to the system. This again requires complex algorithms analyzing the information, creating a model, and executing on its interpretation. As with parsing the words, if it's not sure, the software will prompt for clarification. That clarification is fed back into the system that models your way of speaking and the context of the language you use.
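
That "ask for clarification, then remember the answer" pattern can be sketched as follows. The phrase-to-intent table, the confidence scores, and the threshold are all assumptions made for illustration; no real assistant works from a lookup table, but the feedback loop is the same in spirit.

    # Hypothetical phrase-to-intent table with confidence scores (illustration only).
    known_phrases = {
        "what is the weather forecast for today": ("get_weather", 0.95),
        "weather today": ("get_weather", 0.60),
    }

    CONFIDENCE_THRESHOLD = 0.8

    def ask_user_to_clarify(utterance):
        # Stand-in for a real clarification dialog with the user.
        return "get_weather"

    def interpret(utterance):
        intent, confidence = known_phrases.get(utterance.lower(), ("unknown", 0.0))
        if confidence >= CONFIDENCE_THRESHOLD:
            return intent
        # Not sure: ask the user, then remember the clarified mapping for next time.
        clarified = ask_user_to_clarify(utterance)
        known_phrases[utterance.lower()] = (clarified, 0.9)
        return clarified

    print(interpret("Weather today"))   # low confidence: clarifies once, then remembers
    print(interpret("Weather today"))   # now answered directly from the updated table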

It's all about the data
In a recent TechCrunch article, ‘How startups can compete with enterprises in artificial intelligence and machine learning,' John Melas-Kyriazi refers to data as the ‘fuel we feed into training machine learning models that can create powerful network effects at scale.' I find that a very apt analogy. The complex algorithms and models are the engine of machine learning, but without fuel - the data - even the best engine won't work very well, if at all. A colleague of mine, John Williams (Chief Strategy Officer at Collaborative Consulting), has for years been fond of saying, "It's all about the data." That could not be more true than in the world of machine learning.

Given the importance of the data to the success of any machine learning implementation, there are some key considerations to take into account (a short code sketch illustrating them follows the list):

  • Data Quality - In the world of data, this has always been an important consideration, and data cleansing and scrubbing are already standard practice in many organizations. For machine learning implementations, quality becomes critical: putting dirty fuel into even the best engine will bring it to a grinding halt.
  • Data Volume - Big Data is tailor-made for machine learning. The more information the algorithms and subsequent models have to work with, the better the results. The key word here is learning. We as individuals learn more as more information is provided to us. This concept is directly applicable in the machine learning world.
  • Data Timeliness - Besides volume, new and timely data is also a consideration. If the machine learning is based on a large volume of data that is completely outdated, the resulting models will not be very useful.
  • Data Pedigree - Where did the data come from? Is it a valid source? Pedigree is less of a concern when using internal systems, where the source is well known, but many machine learning systems will be getting their data from public sources, or potentially from the many devices in the world of the Internet of Things. Crowd-sourced data (for example, from Waze, a GPS mobile app) requires extra effort to ensure you can trust the information being consumed. Imagine a new kind of cyber-attack - feeding your machine learning system bad data to skew its results. Remember Microsoft's problem with its AI chatbot Tay learning to be a racist?

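As a rough illustration of how those four considerations can be enforced before any training happens, here is a short Python sketch. The field names, the 30-day freshness window, the trusted-source list, and the minimum record count are all assumptions chosen for the example, not recommended values.

    from datetime import datetime, timedelta, timezone

    TRUSTED_SOURCES = {"internal_crm", "partner_feed"}   # pedigree: only sources we trust
    MAX_AGE = timedelta(days=30)                         # timeliness: reject stale records
    MIN_RECORDS = 1000                                   # volume: enough data to learn from

    def is_clean(record):
        """Quality: reject records with missing or malformed fields."""
        return record.get("customer_id") is not None and record.get("amount", -1) >= 0

    def select_training_data(records):
        """Each record is assumed to be a dict with a timezone-aware 'collected_at' datetime."""
        now = datetime.now(timezone.utc)
        usable = [
            r for r in records
            if is_clean(r)
            and r.get("source") in TRUSTED_SOURCES
            and now - r["collected_at"] <= MAX_AGE
        ]
        if len(usable) < MIN_RECORDS:
            raise ValueError(f"Only {len(usable)} usable records; not enough to train on.")
        return usable
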
No technology negates the need for good design and planning
There is no doubt machine learning technology has amazing potential to impact businesses across the spectrum, from healthcare applications such as diagnosing Alzheimer's disease to self-driving cars that were once the realm of science fiction. No technology negates the need for good design and planning; machine learning is no different. As technologists, it's our responsibility to ensure the proper efforts have been made to supply machine learning implementations with the best fuel possible. Understanding the quality, volume, timeliness, and pedigree needs of these systems can help us navigate this new world of machine learning, leading us to successful execution and, ultimately, providing value back to the business.

More Stories By Ed Featherston

Ed Featherston is VP, Principal Architect at Cloud Technology Partners. He brings 35 years of technology experience in designing, building, and implementing large complex solutions. He has significant expertise in systems integration, Internet/intranet, and cloud technologies. He has delivered projects in various industries, including financial services, pharmacy, government and retail.
