
The Rise of Things, Internet of Things

The screens in our lives will slowly start to take a back seat to a model of computing that operates off the context we generate

End-user computing devices have followed a trajectory of faster, smaller, and cheaper for several decades, adding better connectivity and more natural interfaces, but largely remaining devices with a screen and a human input device. This model is breaking down as computation and connectivity collide with ordinary real-world things. These things often have existing physical methods of interaction that we culturally don't want to change, or no interface at all.

I've been involved with devices for much of my professional career, starting with television set top boxes at Microsoft for the better part of a decade, then working in mobile as part of the Android team at Google, and most recently in the Internet of Things at Nest Labs before rejoining Microsoft as part of our platform strategy team. In my current role, one of my focus areas has been to think about the so-called Internet of Things and what it means for the industry, for Microsoft, and for enterprises and consumers.

It's clear to me that the future of computing lies in these things. The screens in our lives will slowly start to take a back seat to a model of computing that operates off of the context that we generate. In this sense, computing will take a much more active role in our lives while at the same time becoming much less visible. That said, there are substantial challenges in getting from where we are today to this future, and I thought I'd survey those problems and some potential solutions.


Connectivity

In the broader Internet, we've started to think about connectivity as a given. The pervasiveness of networks and the consolidation of the industry around cellular standards like LTE and wireless standards like 802.11 mean that our computing devices are almost always connected, and the design of applications has shifted from primarily offline to primarily online to match.

One of the key challenges in the Internet of Things is that it doesn't fit cleanly into this model. The existing wireless and cellular standards are wholly unsuited to devices that must run for years on a battery - they were designed for devices, like our computers and phones, that are always or frequently connected to a power plug.

A door lock is a good example of a real-world device: it isn't connected to a power plug. One solution could be to change or charge the batteries in your door lock every month so that it can use Wi-Fi, but when you step back and realize that a home may contain hundreds of such devices, it's clear that this maintenance burden would quickly limit our willingness to own more than a handful of them.
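To make the power budget concrete, here's a rough back-of-the-envelope estimate in Python. The capacity and current-draw figures are illustrative assumptions rather than measurements, but they show why an always-associated Wi-Fi radio burns through batteries in about a month while a low-duty-cycle radio can run for years:

# Back-of-the-envelope battery life estimate. All figures are
# illustrative assumptions, not measured values.
BATTERY_MAH = 2400        # assumed capacity of a small battery pack
WIFI_AVG_MA = 3.0         # assumed average draw while associated to Wi-Fi
LOW_POWER_AVG_MA = 0.015  # assumed average draw for a low-duty-cycle radio

def battery_life_days(capacity_mah: float, avg_draw_ma: float) -> float:
    # Runtime in hours is capacity divided by draw; convert to days.
    return capacity_mah / avg_draw_ma / 24

print(f"Wi-Fi:     {battery_life_days(BATTERY_MAH, WIFI_AVG_MA):.0f} days")      # ~33 days
print(f"Low power: {battery_life_days(BATTERY_MAH, LOW_POWER_AVG_MA):.0f} days") # ~18 years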

Rethinking how we connect these devices is therefore one of the key challenges facing the industry. There are a number of efforts to solve this, including new protocols like Zigbee, but the most promising are the efforts to create highly efficient variants of existing protocols: 802.11 with 802.11ah, and Bluetooth with Bluetooth Low Energy (now branded Bluetooth Smart). These technologies hold the promise of overcoming the rapid power drain that makes today's radios impractical for battery-powered things.

In many ways, Bluetooth Smart is already here. As part of the Bluetooth 4.0 spec, it has piggybacked its way into many of the latest Bluetooth chipsets, and the major software platforms (Windows 8, iOS, and Android) all include support for it. Given this, it is becoming prevalent in the latest wave of devices. It also promises multi-year battery life and provides an abstraction mechanism for exposing data and control through its characteristics and services. I wouldn't be surprised to see Bluetooth Smart move front and center in 2014 as it gains critical mass as a key way of bridging to these real-world things.
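To illustrate the characteristics-and-services model, here is a minimal sketch that reads the standard Battery Level characteristic (UUID 0x2A19) from a nearby Bluetooth Smart peripheral. It uses bleak, a third-party Python library and just one of several ways to do this; the device address is hypothetical.

# Sketch: read a GATT characteristic from a Bluetooth Smart peripheral
# using the third-party 'bleak' library (pip install bleak).
import asyncio
from bleak import BleakClient

DEVICE_ADDRESS = "AA:BB:CC:DD:EE:FF"  # hypothetical peripheral address
BATTERY_LEVEL = "00002a19-0000-1000-8000-00805f9b34fb"  # standard Battery Level UUID

async def main():
    async with BleakClient(DEVICE_ADDRESS) as client:
        raw = await client.read_gatt_char(BATTERY_LEVEL)
        print(f"Battery level: {raw[0]}%")  # a single byte, 0-100

asyncio.run(main())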

Applications
The simplicity of these devices implies that what it means to be an application will also change. In this world, applications shift from something with a user interface that runs on our devices, backed by the cloud, to a model where an application analyzes the context provided by a potentially large number of these devices. The application will present itself less on a screen and more through state changes in the real world. These applications will not run on any one of these devices but between them.

Message Based
One potential model for this that we are experimenting with at Microsoft is a messaging-based approach. You can think of it conceptually as "Twitter for devices": devices and applications communicate using messages routed through a message broker. The schema for these messages is well known among the principals in the system, enabling applications and devices that otherwise have no knowledge of each other to communicate.
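As a sketch of what this looks like in practice, here is a device publishing a schema-tagged message through an MQTT broker - one common message-broker protocol, standing in for whatever broker a real system would use. It relies on the third-party paho-mqtt Python library, and the broker hostname, topic, and schema identifier are all hypothetical:

# Sketch: a device publishing a schema-tagged message through an MQTT
# broker, using the third-party 'paho-mqtt' library.
import json
import time

import paho.mqtt.publish as publish

# A well-known schema lets applications that have never heard of this
# particular device interpret its messages.
message = {
    "schema": "org.example.sensor.proximity/1.0",  # hypothetical schema id
    "device_id": "hallway-42",
    "timestamp": time.time(),
    "occupied": True,
}

publish.single(
    "buildings/hq/floor3/hallway-42/proximity",  # hypothetical topic
    payload=json.dumps(message),
    hostname="broker.example.com",               # hypothetical broker
)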

This is a key advantage because devices in this new world are shifting from being consumption and creation devices to devices that provide context and control. A messaging-based approach allows you to leverage the message stream from one of these devices for multiple applications without correspondingly taxing the device with multiple requests for state. For example, a proximity sensor in your office hallway provides very interesting context for a security application for the building, but is equally interesting to an application that uses it to make dynamic climate-control decisions. A messaging model enables this with one set of state. It also provides a clean archiving and auditing model, enabling you to look back over this data two years later, for instance, when you want to build an occupancy model for your building across all of its proximity sensors.
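Continuing the hypothetical broker sketch above, each application simply opens its own subscription to the sensor's stream; the sensor publishes once and never knows how many consumers exist:

# Sketch: two independent applications consuming the same proximity
# stream. In practice each callback would live in its own service;
# subscribe.callback() blocks, so run one per process.
import json

import paho.mqtt.subscribe as subscribe

TOPIC = "buildings/hq/floor3/+/proximity"  # '+' matches any hallway sensor

def security_app(client, userdata, msg):
    event = json.loads(msg.payload)
    if event["occupied"]:
        print(f"[security] motion detected at {msg.topic}")

def climate_app(client, userdata, msg):
    event = json.loads(msg.payload)
    print(f"[climate] occupancy sample for zone model: {event['occupied']}")

# The security service would run this line; the climate service would
# make the same call with climate_app. Neither adds load on the sensor.
subscribe.callback(security_app, TOPIC, hostname="broker.example.com")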

Management
The quantity and sensitivity of these devices will also mean that we need to rethink how we manage them and their data streams.

We currently manage an increasingly large number of computing devices in our lives, and while application stores have made it easier for us to install and upgrade applications and operating systems, we still spend a significant amount of time managing our devices.

As we increase the number of devices by an order of magnitude, we won't be able to provide this same level of love and care for every device in our lives. These devices are going to need to be largely autonomous. One of the core challenges of the Internet of Things will be building the infrastructure to enable this level of autonomy.

Highly Distributed
Our current conception of devices working with services is largely a two-tier model: devices in the field talk directly to services in a distant data center. For many applications that require precise control, the roughly 200ms latency of a round-trip from a home in Oklahoma to a data center in Virginia, where multiple devices' message streams are combined, may be too much. Applications that require this level of low latency will need to execute much closer to the edge. That said, many applications will require the computational capacity and flexibility that only a larger public or private cloud data center can provide. One of the key challenges we face is providing a single abstraction for developers such that both these classes of application use the same interfaces, with infrastructure smart enough to place them transparently.
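No such abstraction exists yet, so purely as a strawman, here is a hypothetical Python sketch of what it might feel like to a developer: declare a latency budget and let the infrastructure pick the tier. None of these names are a real API.

# Hypothetical placement abstraction - not a real API. The developer
# declares requirements; the runtime decides where the code executes.
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    round_trip_ms: int

# Assumed deployment tiers, ordered from edge to cloud.
EDGE = Tier("home-gateway", 5)
REGIONAL = Tier("regional-dc", 40)
CLOUD = Tier("public-cloud", 200)

def place(max_latency_ms, tiers=(EDGE, REGIONAL, CLOUD)):
    """Assign the most capable tier that still meets the latency budget."""
    def wrap(fn):
        eligible = [t for t in tiers if t.round_trip_ms <= max_latency_ms]
        fn.tier = eligible[-1] if eligible else EDGE  # deepest eligible tier
        return fn
    return wrap

@place(max_latency_ms=20)
def actuate_door_lock(event):        # latency-critical: lands on the edge
    pass

@place(max_latency_ms=500)
def update_occupancy_model(event):   # tolerant: lands in the public cloud
    pass

print(actuate_door_lock.tier.name)       # home-gateway
print(update_occupancy_model.tier.name)  # public-cloud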

The data streams involved in the Internet of Things are also typically highly sensitive, either in the context they provide about us or in the sensitivity of the equipment they control. One of the things we must demand, as individuals and enterprises, is control over what data we send to a centralized public cloud versus retain within systems under our control.

I believe these factors will drive a distributed approach to the Internet of Things, where applications move to the data instead of the current direction of all our data moving to the applications in the public cloud. At Microsoft we are currently experimenting with this hybrid approach, with several hierarchical tiers of increasing computation and storage as you move toward the cloud. Applications and data in this model flow between these tiers to the level that appropriately balances computational, latency, and privacy concerns. This distributed approach is also another key reason that an immutable messaging-based approach makes sense - it enables you to replicate these message streams between tiers while applying permission-based controls to filter them down to the messages you are comfortable sharing with another application or computational tier.
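A sketch of that permission-based filtering, reusing the hypothetical message schema from earlier: a gateway decides, message by message, what gets replicated upward to the next tier and what stays local.

# Sketch: filter a local message stream before replicating it upstream.
# Schema names and field names are hypothetical.
SHAREABLE_SCHEMAS = {
    "org.example.sensor.temperature/1.0",  # fine to aggregate in the cloud
}
PRIVATE_FIELDS = {"device_id"}             # strip identifying fields

def filter_for_upstream(message):
    """Return the message to forward upstream, or None to keep it local."""
    if message.get("schema") not in SHAREABLE_SCHEMAS:
        return None  # e.g., proximity events never leave the building
    return {k: v for k, v in message.items() if k not in PRIVATE_FIELDS}

local_stream = [
    {"schema": "org.example.sensor.temperature/1.0", "device_id": "t-7", "celsius": 21.5},
    {"schema": "org.example.sensor.proximity/1.0", "device_id": "hallway-42", "occupied": True},
]
upstream = [m for m in map(filter_for_upstream, local_stream) if m is not None]
print(upstream)  # only the temperature reading, with device_id removed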

Big Data
One thing that is clear is that the volume of data generated by these much more numerous devices will be staggering. For example, capturing all of the data from a single car over its lifetime in an enterprise fleet requires upwards of 100GB, even at a relatively generous once-a-second resolution. For an enterprise like Avis, which has on the order of 150,000 cars, this means managing nearly 15PB of information over the lifetime of one generation of cars.
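The arithmetic behind those figures is worth spelling out; assuming roughly 300-byte records (an illustrative figure) at once-a-second resolution over a ten-year vehicle life:

# Working through the fleet storage estimate. The record size is an
# illustrative assumption.
SAMPLE_BYTES = 300                  # assumed size of one telemetry record
SECONDS_PER_YEAR = 365 * 24 * 3600  # about 31.5 million
LIFETIME_YEARS = 10
FLEET_SIZE = 150_000

per_car_gb = SAMPLE_BYTES * SECONDS_PER_YEAR * LIFETIME_YEARS / 1e9
fleet_pb = per_car_gb * FLEET_SIZE / 1e6
print(f"per car: ~{per_car_gb:.0f} GB, fleet: ~{fleet_pb:.1f} PB")
# per car: ~95 GB, fleet: ~14.2 PB - consistent with ~100GB and ~15PB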

As an industry we have established batch platforms like MapReduce and Hadoop, and newer near-real-time platforms like Storm, to process these large streams of information - but these still require substantial data science and DevOps investments to operate, which puts them out of reach for smaller organizations. A key challenge is making it easier to run data pipelines that operate on the context these devices generate, and building abstractions that make them easier to develop for and to use with existing information-worker tools.
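To give a feel for the kind of computation these pipelines perform, here is the shape of one rolling aggregation reduced to plain Python; a real deployment would run equivalent logic inside Storm or a similar engine, partitioned across the fleet's message streams.

# Sketch: a rolling mean over a telemetry stream - the basic building
# block of a near real-time pipeline, minus the distributed machinery.
from collections import deque

class WindowedMean:
    """Mean of the last `size` samples from one sensor stream."""
    def __init__(self, size):
        self.window = deque(maxlen=size)

    def update(self, value):
        self.window.append(value)
        return sum(self.window) / len(self.window)

rpm = WindowedMean(size=60)  # one-minute window at once-a-second data
for sample in [2100, 2150, 2300, 2250]:  # stand-in for a live feed
    print(f"rolling mean rpm: {rpm.update(sample):.0f}")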

First Steps
We are at the very beginning of this transformation and are all still trying to get our heads around the right model for solving the problems in this space. Although I've posed a number of potential solutions in this post, you should take these more as strawmen to start a discussion than as concrete recommendations. I'd love to talk with you if you're working on any problems in this space - feel free to reach out to me at [email protected] or @timpark on Twitter.

More Stories By Tim Park

Tim Park is Director, Platform Strategy at Microsoft. He helps to set the direction of Microsoft platforms internally with a focus on the startup and open source communities. He has over 15 years of application development experience across client and server from work at Microsoft and two startups (WebTV Networks and Nest Labs).

As global advocate for the startup and open source communities within Microsoft, Park evangelizes the product needs of startups and open source communities across client and cloud, and helps these communities understand what Microsoft has to offer them in terms of platforms, programs, and partnerships.
