When you first heard the term “Internet of Things,” what did you picture? For many, the IoT was consumer-based: fitness trackers, smart thermostats, and connected doors that “tell you” when they are opened. For others, it meant commercial scenarios, such as sensors on jet engines that communicate the state of the engine for analysis, or commercial fleets that could be tracked and managed with a high level of insight into where the vehicles are and their current operating conditions. It may have meant “smart grids” that conveyed the status of electricity distribution to enable more efficient operation. One thing seemed clear: this was all very futuristic, and certainly very cool.

As enthusiasm for the IoT grew through 2013 and 2014, most of the focus was on the smart, connected products themselves. This was a key part of a widely read and highly regarded article published in November 2014 in the Harvard Business Review by Michael Porter of Harvard and Jim Heppelmann of PTC, entitled “How Smart, Connected Products Are Transforming Competition.” In it, they charted the market progression of the IoT from products, to smart products, to smart, connected products, to product systems, and finally to systems of systems. They articulated this likely progression extremely well. That said, the focus at the time, and in many ways since then, has been on smart, connected products, or said differently, “IoT-enabled products.”

We are beginning to see the move toward “product systems,” especially in markets like the smart home, where alliances are forming and standards are consolidating so that logically aligned products can interact. For example, your car arriving within five miles of home should “tell” your house to turn on the lights and adjust the temperature, and, when you are 400 yards away, to open the garage door and unlock the front door.

This is all good. But the attention paid to “IoT-enabled products,” as well as to the early signs of more integrated product systems, has centered mainly on product-level capabilities for alerting and triggering, and on the individual product provider's ability to offer predictive maintenance. These are all good capabilities, and they are advancing products in many ways with compelling results. But to progress from “smart, connected products” to a “system of systems,” the role of data, and the leverage gained from that data, becomes critical.

It is also a much more difficult progression, because the issues are not just technological but contractual as well. Who owns the data, who controls the data, who can see the data, and who can use the data all become key parts of the narrative. The challenge is to go from being an organization that uses “IoT-enabled products” to becoming an “IoT-enabled organization.” This can be accomplished with a “First Receiver” architecture.

Think of the First Receiver as an edge device or service that persists, enriches, and cleanses the data near the point of ingestion. The goal is to separate the creation of the data from its consumption. That way, the product providers can still get their data, but so can the local operation of the enterprise, regional offices, corporate headquarters, third-party supply-chain partners, and perhaps even regulatory oversight bodies. The key is the data governance that determines which constituent is allowed to see what data, and how.
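To make this concrete, here is a minimal sketch in Python of how a first receiver might behave. Everything here is illustrative: the SensorReading shape, the constituents, and the GOVERNANCE_POLICY table are assumptions for the sake of the example, not a reference implementation, and persistence is stubbed out.

```python
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

# Illustrative only: a raw message as it might arrive from a device.
@dataclass
class SensorReading:
    device_id: str
    metric: str   # e.g., "temperature"
    value: float
    unit: str     # e.g., "F"

# Hypothetical governance policy: which constituent may see which fields.
GOVERNANCE_POLICY = {
    "product_provider": {"device_id", "metric", "value", "unit", "received_at"},
    "local_operations": {"device_id", "metric", "value", "unit", "site", "received_at"},
    "corporate_hq":     {"metric", "value", "unit", "site", "received_at"},
    "supply_chain":     {"metric", "site", "received_at"},  # coarser view
}

def first_receiver(reading: SensorReading, site: str) -> dict:
    """Persist and enrich a reading near the point of ingestion,
    then return one governed view per constituent."""
    # 1. Enrich: add context the raw device message does not carry.
    record = asdict(reading)
    record["site"] = site
    record["received_at"] = datetime.now(timezone.utc).isoformat()

    # 2. Persist: a real system would write to local storage here (stubbed).

    # 3. Govern: each constituent gets only the fields the policy allows.
    return {
        constituent: {k: v for k, v in record.items() if k in allowed}
        for constituent, allowed in GOVERNANCE_POLICY.items()
    }

views = first_receiver(SensorReading("fryer-07", "temperature", 363.0, "F"),
                       site="store-214")
for constituent, view in views.items():
    print(constituent, view)
```

The important design choice is the order of operations: the data is persisted and enriched locally first, and only then are constituent-specific views produced.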

This concept has been around for decades. IBM's System R team recognized it when pioneering relational databases in the 1970s, with the central thesis being to leverage the utility value of data. The same temperature reading from a fryer might end up in 10 different locations and be represented in 10 different ways, depending on the context of the given constituent.

The idea of the First Receiver relies on nothing new. The principal tenets of relational data, the ability to apply data governance, the use of device drivers, and event-driven publish-and-subscribe architectures have all been tested and retested over time. What is new is applying them to IoT messages created by billions of sensors deployed around the globe: persisting that data, then managing how it is cleansed, enriched, and ultimately propagated to the right constituent, in the right location, in the right format, at the right time. That is most likely the path that truly leverages the utility value of the data.
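As a rough illustration of the publish-and-subscribe piece, the sketch below uses an invented in-process Broker class; a real deployment would sit on an actual message broker, but the pattern of per-subscriber transformations is the same. It shows how two consumers can subscribe to the same reading and each receive it in their own representation.

```python
from collections import defaultdict
from typing import Callable

# A toy in-process broker, for illustration only.
class Broker:
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable[[dict], None],
                  transform: Callable[[dict], dict] = lambda m: m):
        # Each subscriber registers its own transformation, so the same
        # reading can be propagated in a different shape per consumer.
        self._subscribers[topic].append((handler, transform))

    def publish(self, topic: str, message: dict):
        for handler, transform in self._subscribers[topic]:
            handler(transform(message))

broker = Broker()
# Local operations want Fahrenheit as-is; an analytics consumer wants Celsius.
broker.subscribe("temperature", lambda m: print("ops:", m))
broker.subscribe(
    "temperature", lambda m: print("analytics:", m),
    transform=lambda m: {**m, "value": round((m["value"] - 32) * 5 / 9, 1),
                         "unit": "C"})

broker.publish("temperature",
               {"device_id": "fryer-07", "value": 363.0, "unit": "F"})
```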

In doing so, the operational analytics are much better: you can monitor what is going on with greater insight and granularity. The investigative analytics are much better: they allow you to understand why it is going on, because data mining and forensic analysis over a richer data set can detect higher-quality digital signatures and produce greater insight. The predictive analytics are much better: they allow you to understand what will be going on, because the predictive models can incorporate (and test against) a broader set of dimensions and explore both linear and non-linear correlations to increase the precision of the predictions.
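As a toy illustration of that last point, using entirely synthetic data and standard numpy calls, a straight-line model can badly underfit a relationship that even a simple non-linear model captures.

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical sensor feature (e.g., load) and outcome (e.g., wear rate):
# the true relationship here is quadratic, plus noise.
x = rng.uniform(0, 10, 200)
y = 0.5 * x**2 - x + rng.normal(0, 2, 200)

# Linear fit (degree 1) vs. a non-linear (quadratic) fit.
for degree in (1, 2):
    coeffs = np.polyfit(x, y, degree)
    residuals = y - np.polyval(coeffs, x)
    print(f"degree {degree}: RMS error = {np.sqrt(np.mean(residuals**2)):.2f}")
```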

These cleansed and enriched data sets should also enable more advanced machine learning and artificial intelligence that can ultimately deliver truly adaptive systems. This is the world we are approaching. The implications are enormous, but the progression relies heavily on gaining maximum leverage from the data. Given the enormous focus the market has placed on “IoT-enabled products,” the human side of getting to the “IoT-enabled enterprise” may not be quite so easy. Critical, and perhaps ultimately inevitable, but not easy.


About the Author

Don DeLoach is an entrepreneur, board member, Internet of Things evangelist, and co-author of The Future of IoT. He also serves as co-chairman of the Midwest IoT Council. Previously, he spent 18 years as CEO of three companies.