By Sam Ganga
August 12, 2014 02:00 PM EDT
Science fiction films abound that warn of machines taking control and wreaking havoc on the human race. "2001: A Space Odyssey," "War Games" and "I, Robot" are just a few of the titles that propose what might happen if we hand too much power over to intelligent, interconnected machines.
Decades after the first cautionary tale, the world's machines are more intelligent and more interconnected than even science fiction authors could have predicted. Machine-to-Machine (M2M) communication and the mobile revolution have led to the phenomenon of Big Data, an influx of structured and unstructured data at unprecedented volumes and velocities. The insightful analysis of all that data is proving to be a blessing to humanity, not the threat that many feared. M2M and Big Data analytics can help reduce costs and create competitive advantage for a wide variety of businesses.
What Is M2M?
M2M refers to systems and technologies that make it possible for networked devices to exchange information and perform actions on their own, without (or with minimal) human intervention. Gathering sensor data from devices, analyzing it and using it to exercise more intelligent control can drive better outcomes. Everyday examples include:
- Smart meters, coupled with predictive analytics, enable utility companies to predict demand patterns, automatically adjust to meet peak demand and avoid over-production when demand is low.
- Remote medical sensors can monitor patients, remind them if they've forgotten their medications and alert doctors when intervention might be needed.
- Smart buildings have sensors that can analyze environmental data to save energy and improve safety.
- Traffic data from networked sensors can be analyzed to predict shifts in traffic patterns. Using this information to control traffic signals can actually prevent traffic jams, not just ease them.
- Automated systems like GM's OnStar can alert emergency services when accidents occur, even when the humans involved aren't able to help themselves.
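The common thread in these examples is a gather-analyze-act loop. As a minimal sketch of that pattern, the snippet below simulates the smart-meter case: readings stream in, a simple analysis predicts demand, and a control action follows. The device readings, capacity figure and thresholds are illustrative assumptions, not taken from any real deployment.

```python
# Minimal sketch of the M2M "gather, analyze, act" loop.
# Readings, capacity and thresholds are illustrative assumptions.

def analyze(readings):
    """Analyze step: average the most recent sensor readings (kW)."""
    return sum(readings) / len(readings)

def decide_action(avg_demand, capacity):
    """Act step: choose a control action from predicted demand vs. capacity."""
    if avg_demand > 0.9 * capacity:
        return "scale-up"    # approaching peak: bring extra capacity online
    if avg_demand < 0.3 * capacity:
        return "scale-down"  # demand is low: avoid over-production
    return "hold"

# Gather step: simulated smart-meter readings streamed from networked devices
readings = [460, 470, 480, 490]
action = decide_action(analyze(readings), capacity=500)
print(action)  # "scale-up" for these readings
```

In a real system the analysis step would be a predictive model rather than a moving average, and the action would be issued back to the device over the network; the control flow, however, stays the same.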
How M2M Came to Be
M2M didn't arrive on the scene overnight; as with anything else, it followed an evolutionary process. Back in the 1980s, Supervisory Control And Data Acquisition (SCADA) systems were introduced to enhance controls for electricity generation, transmission and distribution, and to improve monitoring and control for traffic and transportation systems. In the '90s, Wireless Sensor Networks were introduced to improve monitoring and control in many manufacturing and industrial systems. Wireless made it easier to monitor and control a broader range of devices, but these early networks supported only limited, short-range connections.
A major leap forward came in the mid-1990s and early 2000s with the introduction of data modules that could communicate over cellular networks. These systems were used first to connect point of sale (POS) terminals, vehicle sensors and other remote monitoring and tracking systems, and were then extended to automatic meter reading, security, elevator control, fleet management, vending and telemedicine.
M2M communication and applications have really exploded in diversity and number since the introduction of the Internet as a backbone for communication. Three major factors have combined to accelerate the recent growth in M2M:
- More data from more devices can be combined and analyzed more quickly due to advances in tools and technologies for big data analysis and predictive analytics. This enables machine-driven actions based on anticipated conditions - not just faster reaction times.
- The "everywhereness" of broadband networks, wireless and Internet has given rise to the Internet of Things (IoT) and has made it easier and cheaper than ever to connect devices. Assign an IP address to a device with Internet access and you can communicate with it anywhere in the world.
- Cheaper and smaller sensors, memory and processing power mean that more devices can be networked, and the devices themselves can be smarter.
M2M Now and in the Future
Gartner Inc. estimates that there are currently just under 30 billion connected devices and, in a report examining IoT's impact on data centers, projects more than $300 billion in additional revenue for product and service suppliers, mostly in services, by 2020. The firm also predicts $1.9 trillion in total economic impact from improved productivity and cost savings, among other factors.
How M2M Is Being Applied
With virtually every industry affected, M2M applications are startling in their breadth and diversity. Machina Research points to benefits as varied as reduced energy costs, improved safety and security, and increased efficiency and faster response times for emergency services and national defense.
A recent study by Techpro Research offers insight into how far along companies in key verticals are in implementing M2M initiatives. Energy, IT and automotive top the list of current implementations or plans to implement in the next 12 months, followed by healthcare, facilities management, manufacturing and retail.
M2M Success in the Marketplace
For businesses that plan thoughtfully around how to use M2M to achieve their goals, the opportunities to boost revenue, cut costs and serve customers more effectively are tremendous. A few recent examples include:
Retail - Nestlé Nespresso SA has equipped its coffee machines used in restaurants, hotels, offices and luxury retail boutiques to transmit operational and performance data from each machine to a cloud platform for tracking and analysis. The system tracks descaling and other maintenance procedures and alerts technical staff if servicing is required. The applications can also be used to remotely adjust water temperature and pressure. The system helps ensure that machines are maintained in excellent condition, that they produce the highest-quality coffee, cup after cup, and that customers are well supplied with their coffee of choice.
Transportation - The automotive industry and the U.S. Federal Government are embracing M2M. The US Department of Transportation recently conducted research that suggests that Vehicle to Vehicle (V2V) technology could prevent the majority of crashes involving two or more vehicles. Sensors can monitor speed and location of nearby vehicles, analyze risks and either warn drivers (near term) or take action on their own (longer term) to avoid accidents. The research could lead to a mandate to use V2V in the future.
Healthcare - Partnering with the University Teaching Hospitals of Grenoble and Toulouse, France Telecom R&D launched a project called "Gluconet" for managing diabetic patients remotely. A special instrument is used to periodically read patient glycemia data. This information gets transmitted automatically to the management center via mobile devices. The doctors can access the information over the Internet. Based on the analysis, doctors send medical advice to patients via SMS or voice messaging. The key advantage here is that both patients and doctors are alerted of any complications well before they become life-threatening.
Consumer - Lexmark, a provider of printing and imaging products, software, solutions and services, deployed M2M for more effective customer servicing. Lexmark uses M2M to collect data from millions of printers. The company analyzes the data to streamline its products to serve customers better, increase revenues and reduce operational costs.
Facilities Management - Commercial real estate services firm Jones Lang LaSalle (JLL) deployed an M2M system called IntelliCommand to collect data from building systems for security and protection against heating, cooling or fire incidents. Information collected by remote sensors is transmitted to a cloud-hosted system for in-depth analysis. When sensors collect data that strays outside of established parameters, alarms are relayed to a control center to alert managers. JLL's pilot installation with four sites enabled clients to cut costs by 15-20 percent. The real estate giant is now extending its deployment to 76 buildings.
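The JLL example describes a concrete rule: readings that stray outside established parameters trigger alarms relayed to a control center. The sketch below shows what such a parameter check might look like; the sensor names and operating ranges are illustrative assumptions, not details of IntelliCommand itself.

```python
# Hedged sketch of a "readings outside established parameters" check.
# Sensor names and ranges are illustrative assumptions.

OPERATING_RANGES = {
    "temperature_f": (64.0, 78.0),  # HVAC comfort band
    "smoke_ppm": (0.0, 2.0),        # fire-detection threshold
}

def check_readings(readings):
    """Compare each sensor reading to its allowed range; return alarms."""
    alarms = []
    for sensor, value in readings.items():
        low, high = OPERATING_RANGES[sensor]
        if not (low <= value <= high):
            alarms.append(f"{sensor}={value} outside [{low}, {high}]")
    return alarms

# An overheated zone raises one alarm; normal smoke levels raise none
alarms = check_readings({"temperature_f": 82.5, "smoke_ppm": 0.4})
print(alarms)
```

In a cloud-hosted deployment, this check would run against streams of readings and the resulting alarms would be routed to facility managers rather than printed.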
How to Begin the Process
M2M possibilities for some organizations are self-evident. An equipment manufacturer might see an opportunity to leverage machine data to provide better service and build loyalty. Another might see an opportunity to add value that can be monetized. Some companies might find themselves threatened by competitors who have already started using M2M to gain advantage. But it's not so cut and dried for some businesses. The "M2M Opportunity Matrix" shown here offers some structure that can be used to think about M2M and identify opportunities that can improve business performance.
Listed across the top of the Matrix are possible business objectives. This isn't an exhaustive list, but you could do a lot of good for your business by finding ways to reduce cost, increase revenue or add value.
Options related to data sources are listed down the left side. Your organization might already have a large database of information that's coming in from POS systems or manufacturing control systems or some other source - Data In-Hand. But maybe you haven't figured out what to do with the information yet. There might be additional data that you could be collecting from existing "sensors" - New Data from Existing Sources. Or there might be new data that you could access with new sensors, or by sourcing from outside your company - New Data from New Sources. Probably, the data you already have in hand is going to be the easiest to tap into to achieve business objectives. But some opportunities might be so valuable that it's worth deploying new sensors to gather new data.
There's a potential M2M opportunity at the juncture of each business objective and data source. So, do some brainstorming. Start the process by thinking of how to leverage different data sources to achieve various business objectives. It can go in a lot of directions from there.
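One quick way to run that brainstorming exercise is simply to enumerate every juncture of the Matrix. The objectives and data sources below follow the article's own categories; extend either list to widen the search.

```python
# Enumerate every juncture of the M2M Opportunity Matrix:
# each (objective, data source) pair is a candidate opportunity.
from itertools import product

objectives = ["reduce cost", "increase revenue", "add value"]
data_sources = [
    "data in-hand",
    "new data from existing sources",
    "new data from new sources",
]

junctures = list(product(objectives, data_sources))
for obj, src in junctures:
    print(f"How could '{src}' help us {obj}?")

# 3 objectives x 3 data sources = 9 candidate opportunities to evaluate
```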
Alternatively, an experienced data consultant can help you look objectively at your situation and help you to identify low-hanging fruit or the really game-changing opportunities that could deliver more transformative results. There are a lot of right answers. The best thing is to get started.
Making the Most of M2M
It turns out that, so far at least, all those cautionary tales about intelligent machines have not come true. In fact, interconnected machines and the data they generate are improving the ways we live and do business. The upshot of M2M is smarter systems that don't rely on slower human input and can adapt more quickly as conditions change. Already we are seeing remarkable innovations like remote glucose monitoring, more efficient printing and safer buildings. And that's only the beginning. At the risk of imitating the sci-fi writers who were a bit off base, we hesitate to predict what other life-enhancing technologies powered by M2M are on the horizon.
You, meanwhile, should not hesitate to take part in the M2M revolution. If you wait for someone else to figure out how best to leverage M2M, you are likely to lose market share, or lose the opportunity altogether. If it's hard to know where to start, work with a data consultant who can help create a plan. Don't let the intelligent machines outsmart you.