By Sam Ganga
August 12, 2014 02:00 PM EDT
Science fiction films abound that warn of machines taking control and wreaking havoc on the human race. "2001: A Space Odyssey," "WarGames" and "I, Robot" are just a few of the titles that propose what might happen if we hand too much power over to intelligent, interconnected machines.
Decades after the first cautionary tale, the world's machines are more intelligent and more interconnected than even science fiction authors could have predicted. Machine to Machine (M2M) communication and the mobile revolution have led to the phenomenon of Big Data, an influx of structured and unstructured data at previously unheard-of volumes and velocities. The insightful analysis of all that data is proving to be a blessing to humanity, not the threat that many feared. M2M and Big Data Analytics can help reduce costs and create competitive advantage for a wide variety of businesses.
What Is M2M?
M2M refers to systems and technologies that make it possible for networked devices to exchange information and perform actions on their own, without (or with minimal) human intervention. Gathering sensor data from devices, analyzing it and using it to exercise more intelligent control can drive better outcomes. Everyday examples include:
- Smart meters, coupled with predictive analytics, enable utility companies to predict demand patterns, automatically adjust to meet peak demand and avoid over-production when demand is low.
- Remote medical sensors can monitor patients, remind them if they've forgotten their medications and alert doctors when intervention might be needed.
- Smart buildings have sensors that can analyze environmental data to save energy and improve safety.
- Traffic data from networked sensors can be analyzed to predict shifts in traffic patterns. Using this information to control traffic signals can actually prevent traffic jams, not just ease them.
- Automated systems like GM's OnStar can alert emergency services when accidents occur, even when the humans involved aren't able to help themselves.
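The pattern behind all of these examples is the same: read a sensor, analyze the reading, act without waiting for a human. A minimal sketch of that sense-analyze-act loop follows; the device names, metrics and thresholds are hypothetical illustrations, not any vendor's actual system.

```python
# Minimal sketch of the M2M sense-analyze-act loop.
# Device names, metrics and operating bands are hypothetical.

from dataclasses import dataclass

@dataclass
class Reading:
    device_id: str
    metric: str      # e.g. "kwh_demand" for a smart meter
    value: float

# Hypothetical per-metric operating bands: (low, high).
THRESHOLDS = {
    "kwh_demand": (10.0, 95.0),
    "heart_rate": (50.0, 120.0),
}

def analyze(reading: Reading) -> str:
    """Classify a reading against its operating band."""
    low, high = THRESHOLDS[reading.metric]
    if reading.value < low:
        return "under"
    if reading.value > high:
        return "over"
    return "normal"

def act(reading: Reading) -> str:
    """Decide on an automatic action; no human in the loop."""
    status = analyze(reading)
    if status in ("over", "under"):
        return f"ALERT {reading.device_id}: {reading.metric} {status} ({reading.value})"
    return f"OK {reading.device_id}"

print(act(Reading("meter-17", "kwh_demand", 102.4)))
# -> ALERT meter-17: kwh_demand over (102.4)
```

In a real deployment the `act` step would adjust generation, page a doctor or retime a traffic signal; the structure of the loop is what M2M systems share.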
How M2M Came to Be
M2M didn't arrive on the scene overnight; as with anything else, it followed an evolutionary process. Back in the 1980s, Supervisory Control and Data Acquisition (SCADA) systems were introduced to enhance controls for electricity generation, transmission and distribution, and to improve monitoring and control for traffic and transportation systems. In the '90s, Wireless Sensor Networks were introduced to improve monitoring and control in many manufacturing and industrial systems. Wireless made it easier to monitor and control a broader range of devices but only supported limited, short-range connections.
When data modules were introduced in the mid-1990s and early 2000s that could communicate via cellular networks, a major leap forward occurred. These systems were used first to connect point of sale (POS) terminals, vehicle sensors and other remote monitoring and tracking systems, and then were further extended to automatic meter reading, security, elevator control, fleet management, vending and telemedicine.
M2M communication and applications have really exploded in diversity and number since the introduction of the Internet as a backbone for communication. Three major factors have combined to accelerate the recent growth in M2M:
- More data from more devices can be combined and analyzed more quickly due to advances in tools and technologies for big data analysis and predictive analytics. This enables machine-driven actions based on anticipated conditions - not just faster reaction times.
- The "everywhereness" of broadband networks, wireless and Internet has given rise to the Internet of Things (IoT) and has made it easier and cheaper than ever to connect devices. Assign an IP address to a device with Internet access and you can communicate with it anywhere in the world.
- Cheaper and smaller sensors, memory and processing power mean that more devices can be networked, and the devices themselves can be smarter.
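The IP-addressability point above can be made concrete with a few lines of socket code: once a device has an IP address and Internet access, talking to it is just a network call. A hedged sketch; the host, port and line-based request format are hypothetical, not any real device's protocol.

```python
# Sketch: polling an IP-addressable device over TCP.
# The host, port and "READ <metric>" protocol are hypothetical.

import socket

def poll_device(host: str, port: int, timeout: float = 2.0) -> str:
    """Send a read request to a networked sensor and return its reply."""
    with socket.create_connection((host, port), timeout=timeout) as conn:
        conn.sendall(b"READ temperature\n")
        return conn.recv(1024).decode().strip()

# Usage, assuming such a device were listening at this address:
# reading = poll_device("device.example.com", 5000)
```

Real M2M deployments layer protocols such as MQTT or CoAP on top of this, but the underlying reachability is exactly what IP addressing provides.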
M2M Now and in the Future
Gartner Inc. projects that the number of connected devices will approach 30 billion by 2020, with IoT generating $309 billion in additional revenue for product and service suppliers, mostly in services. The firm also predicts $1.9 trillion in total economic impact from improved productivity and cost savings, among other factors, and points to data centers as one area that will feel IoT's impact as all that device data has to be transported, stored and analyzed.
How M2M Is Being Applied
With virtually every industry impacted, M2M applications are startling in their breadth and diversity. Machina Research points to benefits as varied as reduced energy costs, improved safety and security, increased efficiency, and faster response times for emergency services and national defense.
A recent study by Techpro Research offers some insight into how far along companies in key verticals are in implementing M2M initiatives. Energy, IT and automotive top the list of current implementations or plans to implement within the next 12 months, followed by healthcare, facility management, manufacturing and retail.
M2M Success in the Marketplace
For businesses that plan thoughtfully around how to use M2M to achieve their goals, the opportunities to boost revenue, cut costs and serve customers more effectively are tremendous. A few recent examples include:
Retail - Nestlé Nespresso SA has equipped its coffee machines used in restaurants, hotels, offices and luxury retail boutiques to transmit operational and performance data from each machine to a cloud platform for tracking and analysis. The system tracks descaling and other maintenance procedures and alerts technical staff if servicing is required. The applications can also be used to remotely adjust water temperature and pressure. The system helps ensure that machines are maintained in excellent condition, that they produce the highest-quality coffee, cup after cup, and that customers are well supplied with their coffee of choice.
Transportation - The automotive industry and the U.S. Federal Government are embracing M2M. The U.S. Department of Transportation recently conducted research suggesting that Vehicle to Vehicle (V2V) technology could prevent the majority of crashes involving two or more vehicles. Sensors can monitor the speed and location of nearby vehicles, analyze risks and either warn drivers (near term) or take action on their own (longer term) to avoid accidents. The research could lead to a future mandate to use V2V.
Healthcare - Partnering with the University Teaching Hospitals of Grenoble and Toulouse, France Telecom R&D launched a project called "Gluconet" for managing diabetic patients remotely. A special instrument is used to periodically read patient glycemia data. This information gets transmitted automatically to the management center via mobile devices. The doctors can access the information over the Internet. Based on the analysis, doctors send medical advice to patients via SMS or voice messaging. The key advantage here is that both patients and doctors are alerted of any complications well before they become life-threatening.
Consumer - Lexmark, a provider of printing and imaging products, software, solutions and services, deployed M2M for more effective customer servicing. Lexmark uses M2M to collect data from millions of printers. The company analyzes the data to streamline its products to serve customers better, increase revenues and reduce operational costs.
Facilities Management - Commercial real estate services firm Jones Lang LaSalle (JLL) deployed an M2M system called IntelliCommand to collect data from building systems for security and protection against heating, cooling or fire incidents. Information collected by remote sensors is transmitted to a cloud-hosted system for in-depth analysis. When sensors collect data that strays outside of established parameters, alarms are relayed to a control center to alert managers. JLL's pilot installation with four sites enabled clients to cut costs by 15-20 percent. The real estate giant is now extending its deployment to 76 buildings.
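The "strays outside of established parameters" logic in a system like IntelliCommand can be approximated with a rolling baseline: flag readings that deviate several standard deviations from recent history and relay an alarm. A simplified sketch; the window size and sigma limit are illustrative choices, not JLL's actual parameters.

```python
# Sketch: flag sensor readings that stray outside established parameters,
# using a rolling mean +/- k standard deviations. Window and k are illustrative.

from collections import deque
from statistics import mean, stdev

class BaselineMonitor:
    def __init__(self, window: int = 20, k: float = 3.0):
        self.history = deque(maxlen=window)  # recent readings only
        self.k = k

    def check(self, value: float) -> bool:
        """Return True if this reading should raise an alarm."""
        alarm = False
        if len(self.history) >= 5:  # need some history before judging
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(value - mu) > self.k * sigma:
                alarm = True
        self.history.append(value)
        return alarm

monitor = BaselineMonitor()
for temp in [21.0, 21.2, 20.9, 21.1, 21.0, 21.2, 35.0]:
    if monitor.check(temp):
        print(f"Relay alarm to control center: reading {temp}")
```

A production system would also handle sensor dropout, seasonal baselines and alarm deduplication, but the core idea is the same: the parameters are learned from the data the sensors themselves supply.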
How to Begin the Process
M2M possibilities for some organizations are self-evident. An equipment manufacturer might see an opportunity to leverage machine data to provide better service and build loyalty. Another might see an opportunity to add value that can be monetized. Some companies might find themselves threatened by competitors who have already started using M2M to gain advantage. But it's not so cut and dried for every business. The "M2M Opportunity Matrix" described below offers some structure for thinking about M2M and identifying opportunities that can improve business performance.
Listed across the top of the Matrix are possible business objectives. This isn't an exhaustive list, but you could do a lot of good for your business by finding ways to reduce cost, increase revenue or add value.
Options related to data sources are listed down the left side. Your organization might already have a large database of information that's coming in from POS systems or manufacturing control systems or some other source - Data In-Hand. But maybe you haven't figured out what to do with the information yet. There might be additional data that you could be collecting from existing "sensors" - New Data from Existing Sources. Or there might be new data that you could access with new sensors, or by sourcing from outside your company - New Data from New Sources. Probably, the data you already have in hand is going to be the easiest to tap into to achieve business objectives. But some opportunities might be so valuable that it's worth deploying new sensors to gather new data.
There's a potential M2M opportunity at the juncture of each business objective and data source. So, do some brainstorming. Start the process by thinking of how to leverage different data sources to achieve various business objectives. It can go in a lot of directions from there.
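That brainstorming step can even be mechanized: cross every data source with every business objective and review each cell in turn. A small sketch using the matrix axes described above; the example idea captured in a cell is a placeholder, not a recommendation.

```python
# Sketch of the M2M Opportunity Matrix as a grid of (data source, objective)
# junctures. The axes come from the article; the ideas you record are your own.

from itertools import product

OBJECTIVES = ["Reduce cost", "Increase revenue", "Add value"]
DATA_SOURCES = [
    "Data In-Hand",
    "New Data from Existing Sources",
    "New Data from New Sources",
]

def empty_matrix() -> dict:
    """One brainstorming cell per (source, objective) juncture."""
    return {(src, obj): [] for src, obj in product(DATA_SOURCES, OBJECTIVES)}

matrix = empty_matrix()
# Hypothetical idea captured during a brainstorming session:
matrix[("Data In-Hand", "Reduce cost")].append(
    "Mine existing POS data to cut inventory carrying costs"
)

print(len(matrix))  # 3 sources x 3 objectives = 9 junctures to consider
```

Walking the nine cells one at a time keeps the exercise systematic, so promising junctures aren't skipped just because the obvious ideas cluster in one corner of the matrix.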
Alternatively, an experienced data consultant can help you look objectively at your situation and help you to identify low-hanging fruit or the really game-changing opportunities that could deliver more transformative results. There are a lot of right answers. The best thing is to get started.
Making the Most of M2M
It turns out that, so far at least, all those cautionary tales about intelligent machines have proven unfounded. In fact, interconnected machines and the data they generate are improving the ways we live and do business. The upshot of M2M is smarter systems that don't need to rely on slower human input and can adapt more quickly as conditions change. Even now we are seeing remarkable innovations like remote glucose monitoring, more efficient printing and safer buildings. And that's only the beginning. At the risk of imitating sci-fi writers who were a bit off base, we hesitate to predict what other life-enhancing technologies powered by M2M are on the horizon.
You, meanwhile, should not hesitate to take part in the M2M revolution. If you wait for someone else to figure out how best to leverage M2M, you are likely to lose market share or lose the opportunity altogether. Knowing where to start may seem overwhelming; if that's the case, work with a data consultant who can help create a plan. Don't let the intelligent machines outsmart you.