By Sam Ganga
August 12, 2014 02:00 PM EDT
Science fiction films abound that warn of machines taking control and wreaking havoc on the human race. "2001: A Space Odyssey," "War Games" and "I, Robot" are just a few of the titles that propose what might happen if we hand too much power over to intelligent, interconnected machines.
Decades after the first cautionary tale, the world's machines are more intelligent and more interconnected than even science fiction authors could have predicted. Machine to Machine (M2M) communication and the mobile revolution have led to the phenomenon of Big Data, an influx of structured and unstructured data at unprecedented volumes and velocities. The insightful analysis of all that data is proving to be a blessing to humanity, not the threat that many feared. M2M and Big Data analytics can help reduce costs and create competitive advantage for a wide variety of businesses.
What Is M2M?
M2M refers to systems and technologies that make it possible for networked devices to exchange information and perform actions on their own, without (or with minimal) human intervention. Gathering sensor data from devices, analyzing it and using it to exercise more intelligent control can drive better outcomes. Everyday examples include:
- Smart meters, coupled with predictive analytics, enable utility companies to predict demand patterns, automatically adjust to meet peak demand and avoid over-production when demand is low.
- Remote medical sensors can monitor patients, remind them if they've forgotten their medications and alert doctors when intervention might be needed.
- Smart buildings have sensors that can analyze environmental data to save energy and improve safety.
- Traffic data from networked sensors can be analyzed to predict shifts in traffic patterns. Using this information to control traffic signals can actually prevent traffic jams, not just ease them.
- Automated systems like GM's OnStar can alert emergency services when accidents occur, even when the humans involved aren't able to help themselves.
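The common pattern running through these examples is a gather-analyze-act loop: collect sensor readings, analyze them, and trigger an action without a human in the path. The sketch below illustrates that loop for the smart-meter case; the readings, threshold and action names are hypothetical values invented for illustration, not any utility's real parameters.

```python
# Illustrative M2M control loop: gather sensor data, analyze it, act.
# The threshold, readings, and action names are hypothetical.

PEAK_THRESHOLD_KW = 450.0  # demand level that triggers extra generation

def analyze_and_act(readings):
    """Average recent smart-meter readings and decide on an action."""
    avg_demand = sum(readings) / len(readings)
    if avg_demand > PEAK_THRESHOLD_KW:
        return "increase_generation"   # anticipate peak demand
    elif avg_demand < PEAK_THRESHOLD_KW * 0.5:
        return "reduce_generation"     # avoid over-production
    return "hold_steady"

# Simulated readings from networked smart meters (kW)
recent_readings = [430.2, 465.8, 472.1, 458.9]
print(analyze_and_act(recent_readings))  # -> increase_generation
```

Trivial as it is, the same shape (readings in, decision out, no operator required) underlies each of the examples above; only the sensors and the actions change.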
How M2M Came to Be
M2M didn't arrive on the scene overnight; as with anything else, it followed an evolutionary process. Back in the 1980s, Supervisory Control And Data Acquisition (SCADA) systems were introduced to enhance controls for electricity generation, transmission and distribution, and to improve monitoring and control for traffic and transportation systems. In the '90s, Wireless Sensor Networks were introduced to improve monitoring and control in many manufacturing and industrial systems. Wireless made it easier to monitor and control a broader range of devices but only supported limited, short-range connections.
A major leap forward came in the mid-1990s and early 2000s, when data modules were introduced that could communicate via cellular networks. These systems were first used to connect point of sale (POS) terminals, vehicle sensors and other remote monitoring and tracking systems, and were later extended to automatic meter reading, security, elevator control, fleet management, vending and telemedicine.
M2M communication and applications have exploded in diversity and number since the Internet became a communication backbone. Three major factors have combined to accelerate M2M's recent growth:
- More data from more devices can be combined and analyzed more quickly due to advances in tools and technologies for big data analysis and predictive analytics. This enables machine-driven actions based on anticipated conditions - not just faster reaction times.
- The "everywhereness" of broadband networks, wireless and Internet has given rise to the Internet of Things (IoT) and has made it easier and cheaper than ever to connect devices. Assign an IP address to a device with Internet access and you can communicate with it anywhere in the world.
- Cheaper and smaller sensors, memory and processing power mean that more devices can be networked, and the devices themselves can be smarter.
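The second factor above is worth making concrete: once a device is IP-addressable, talking to it reduces to exchanging small structured messages over the network. The round-trip below sketches that idea with a made-up status payload; the device ID, fields and IP address (drawn from the RFC 5737 documentation range) are invented for illustration.

```python
import json

# Hypothetical status payload a networked device might publish
# over HTTP or MQTT. Device ID, fields, and units are invented.
device_status = {
    "device_id": "meter-0042",
    "ip": "203.0.113.17",   # documentation-range IP (RFC 5737)
    "demand_kw": 456.75,
    "firmware": "2.1.0",
}

payload = json.dumps(device_status)   # what travels over the network
received = json.loads(payload)        # what the back end decodes
print(received["device_id"], received["demand_kw"])
```

Everything beyond this serialization step is commodity networking, which is precisely why connecting devices has become so cheap.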
M2M Now and in the Future
Gartner Inc. projects nearly 30 billion connected devices by 2020 and estimates that IoT product and service suppliers will generate incremental revenue exceeding $300 billion, mostly in services, by that year. The firm also predicts $1.9 trillion in total economic impact from improved productivity, cost savings and other factors.
How M2M Is Being Applied
With virtually every industry impacted, applications of M2M technology are startling in their breadth and diversity. Machina Research points to benefits as varied as reduced energy costs, improved safety and security, increased efficiency, and faster response times for emergency services and national defense.
A recent study by Tech Pro Research offers insight into how far along companies in key verticals are in implementing M2M initiatives. Energy, IT and automotive top the list of current implementations or plans to implement within the next 12 months, followed by healthcare, facility management, manufacturing and retail.
M2M Success in the Marketplace
For businesses that plan thoughtfully around how to use M2M to achieve their goals, the opportunities to boost revenue, cut costs and serve customers more effectively are tremendous. A few recent examples:
Retail - Nestlé Nespresso SA has equipped its coffee machines used in restaurants, hotels, offices and luxury retail boutiques to transmit operational and performance data from each machine to a cloud platform for tracking and analysis. The system tracks descaling and other maintenance procedures and alerts technical staff if servicing is required. The applications can also be used to remotely adjust water temperature and pressure. The system helps ensure that machines are maintained in excellent condition, that they produce the highest-quality coffee, cup after cup, and that customers are well supplied with their coffee of choice.
Transportation - The automotive industry and the U.S. Federal Government are embracing M2M. The US Department of Transportation recently conducted research that suggests that Vehicle to Vehicle (V2V) technology could prevent the majority of crashes involving two or more vehicles. Sensors can monitor speed and location of nearby vehicles, analyze risks and either warn drivers (near term) or take action on their own (longer term) to avoid accidents. The research could lead to a mandate to use V2V in the future.
Healthcare - Partnering with the University Teaching Hospitals of Grenoble and Toulouse, France Telecom R&D launched a project called "Gluconet" for managing diabetic patients remotely. A special instrument periodically reads patient glycemia data, which is transmitted automatically to the management center via mobile devices. Doctors can access the information over the Internet and, based on their analysis, send medical advice to patients via SMS or voice messaging. The key advantage is that both patients and doctors are alerted to complications well before they become life-threatening.
Consumer - Lexmark, a provider of printing and imaging products, software, solutions and services, deployed M2M for more effective customer servicing. Lexmark uses M2M to collect data from millions of printers. The company analyzes the data to streamline its products to serve customers better, increase revenues and reduce operational costs.
Facilities Management - Commercial real estate services firm Jones Lang LaSalle (JLL) deployed an M2M system called IntelliCommand that collects data from building systems to monitor security and guard against heating, cooling and fire incidents. Information collected by remote sensors is transmitted to a cloud-hosted system for in-depth analysis. When sensor data strays outside established parameters, alarms are relayed to a control center to alert managers. JLL's four-site pilot installation enabled clients to cut costs by 15-20 percent, and the real estate giant is now extending the deployment to 76 buildings.
How to Begin the Process
M2M possibilities for some organizations are self-evident. An equipment manufacturer might see an opportunity to leverage machine data to provide better service and build loyalty. Another might see an opportunity to add value that can be monetized. Some companies might find themselves threatened by competitors who have already started using M2M to gain advantage. But for other businesses it's not so cut and dried. The "M2M Opportunity Matrix" described below offers some structure for thinking about M2M and identifying opportunities that can improve business performance.
Listed across the top of the Matrix are possible business objectives. This isn't an exhaustive list, but you could do a lot of good for your business by finding ways to reduce cost, increase revenue or add value.
Options related to data sources are listed down the left side. Your organization might already have a large database of information that's coming in from POS systems or manufacturing control systems or some other source - Data In-Hand. But maybe you haven't figured out what to do with the information yet. There might be additional data that you could be collecting from existing "sensors" - New Data from Existing Sources. Or there might be new data that you could access with new sensors, or by sourcing from outside your company - New Data from New Sources. Probably, the data you already have in hand is going to be the easiest to tap into to achieve business objectives. But some opportunities might be so valuable that it's worth deploying new sensors to gather new data.
There's a potential M2M opportunity at the juncture of each business objective and data source. So, do some brainstorming. Start the process by thinking of how to leverage different data sources to achieve various business objectives. It can go in a lot of directions from there.
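That brainstorming step can be made mechanical: simply enumerate every cell of the matrix as a question to answer. The objective and data-source labels below follow the article's own Matrix; the loop is just a prompt generator, not a substitute for the actual analysis.

```python
# Enumerate every cell of the M2M Opportunity Matrix as a
# brainstorming prompt: one question per (data source, objective) pair.
objectives = ["reduce cost", "increase revenue", "add value"]
data_sources = [
    "Data In-Hand",
    "New Data from Existing Sources",
    "New Data from New Sources",
]

prompts = [
    f"How could '{source}' help us {objective}?"
    for source in data_sources
    for objective in objectives
]

for prompt in prompts:
    print(prompt)
# 3 data sources x 3 objectives = 9 candidate opportunities
```

Nine questions is a manageable agenda for a planning workshop, and any cell that yields a credible answer is a candidate M2M initiative.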
Alternatively, an experienced data consultant can help you look objectively at your situation and help you to identify low-hanging fruit or the really game-changing opportunities that could deliver more transformative results. There are a lot of right answers. The best thing is to get started.
Making the Most of M2M
It turns out that, so far at least, all those cautionary tales about intelligent machines have proven unfounded. In fact, interconnected machines and the data they generate are improving the ways we live and do business. The upshot of M2M is smarter systems that don't need to rely on slower human input and can adapt more quickly as conditions change. Even now we are seeing incredible innovations like remote glucose monitoring, more efficient printing and safer buildings. And that's only the beginning. At the risk of imitating the sci-fi writers who were a bit off base, we hesitate to predict what other life-enhancing technologies powered by M2M are on the horizon.
You, meanwhile, should not hesitate to take part in the M2M revolution. If you wait for someone else to figure out how to best leverage M2M, you are likely to lose market share or lose the opportunity altogether. It may seem overwhelming to know where to start; if that's the case, work with a data consultant who can help create a plan. Don't let the intelligent machines outsmart you.