Digital Transformation

An IoT evolution path


I’m not normally a fan of evolution paths or maturity curves, because adoption of something new doesn’t always follow a linear path. That said, there do seem to be distinct levels in the adoption of IoT, so I thought I’d at least lay down some kind of hierarchy.

This evolution is very loosely based upon McKinsey’s machine learning evolution path. The two probably fit well together because, as we’ll see, getting tons of IoT data in is not really an end in itself - it’s what you do with all the data that matters.

Let’s look, then, at the different levels of an IoT evolution.


Level 1 : Discover or sense

This is where we get in data that tells us what’s going on. In the case of IoT, this might be the input from a pressure sensor.

Of course, one of the most popular “IoT sensors” in the future is going to be video. 

Some people call this level of the evolution, “sense”. 

You can take the input from the discover/sense step and use it without any more components of an IoT solution. For example, you could have an air quality sensor added to your existing production facility. Or, my car has an oil quality sensor at the bottom of the sump. There is no fixed service interval - the car estimates when I’ll need a service based upon its readings (this is very primitive prediction - more on prediction below).
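To make the sense step concrete, here’s a minimal Python sketch. The read_pressure_kpa function is a stand-in I’ve invented for this post - a real deployment would read from a device driver or fieldbus rather than simulating values:

```python
import random
import time

def read_pressure_kpa() -> float:
    """Read one sample from a pressure sensor.

    Stand-in for a real driver call (e.g. over I2C or Modbus);
    here we simulate readings around 101 kPa.
    """
    return 101.3 + random.gauss(0, 0.5)

if __name__ == "__main__":
    for _ in range(5):
        print(f"pressure: {read_pressure_kpa():.2f} kPa")
        time.sleep(1)
```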


Level 2 : Infer what’s going on

If the sensor is a direct measurement - pressure, air quality, oil quality - then we may think we already know what’s going on. Or, we may need to combine the readings from a number of sensors to infer what’s going on.

However, if our sensor is a video stream, then there is plenty of inferring to do. Depending on the situation, the video inference will be very different. For example, the inference needed for a self-driving vehicle is very different to that required, say, when doing precision farming through video feeds of the crops in a field.

But because video will be such a popular sensor in the future, we are going to see a lot of video inference solutions.
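Here’s an illustrative Python sketch of the simpler, non-video case: combining three direct measurements into one inferred condition. The sensors and thresholds are invented for the example, not taken from any real equipment spec:

```python
from dataclasses import dataclass

@dataclass
class Readings:
    """One snapshot from three (invented) sensors on a pump."""
    pressure_kpa: float
    vibration_mm_s: float
    temperature_c: float

def infer_condition(r: Readings) -> str:
    """Combine direct measurements into a single inferred condition.

    The thresholds are illustrative only.
    """
    if r.pressure_kpa < 80 and r.vibration_mm_s > 7.0:
        return "possible cavitation"
    if r.temperature_c > 90:
        return "overheating"
    return "normal"

print(infer_condition(Readings(pressure_kpa=75.0, vibration_mm_s=8.2, temperature_c=60.0)))
```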


Level 3 : Act

This is where it gets a little fuzzy. We could take the output from the infer step and use it to take action. If, for example, we infer that an old person has fallen over, we could alert a carer. Or, if we detect that my oil’s viscosity is out of range, we could arrange for me to take the car in for a service. 

When we act, we are probably initiating a business process. In the case of a fall in the home, it’s a “check patient for fall” process. In the case of my oil, it’s a “get car in for service” process.
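In code, the act step might look something like the sketch below, using the two examples above. The mapping and process names are illustrative; a real system would call out to a workflow engine or notification service rather than return strings:

```python
def act_on(condition: str) -> str:
    """Map an inferred condition to the business process it should start."""
    processes = {
        "person has fallen": "start 'check patient for fall' process (alert carer)",
        "oil viscosity out of range": "start 'get car in for service' process",
    }
    # Unrecognised conditions go to a human rather than being dropped.
    return processes.get(condition, "no matching process - escalate to a human")

print(act_on("person has fallen"))
print(act_on("oil viscosity out of range"))
```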


Level 3.1 : Integrate into existing business processes

Many business processes are, of course, already controlled by IT systems - typically ERP, service desk or CRM systems.

It’s estimated that, today, only 23% of IoT projects are connected to Enterprise IT systems. In other words, the majority of the IoT-based “acts” today are working in isolation. 

Some people are alarmed by this. I think it’s fine - it’s early days for IoT, and it’s probably best to get the sense / infer / act chain working in isolation before connecting it to existing ERP, CRM or service desk systems. 
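When a project does connect the act step to enterprise IT, the plumbing is often just a call into the existing system’s API. Here’s a rough sketch of raising a service desk ticket from an IoT gateway - the endpoint URL and payload shape are hypothetical, and a real integration would follow the ticketing system’s actual API:

```python
import requests  # third-party: pip install requests

# Hypothetical service desk endpoint; not a real product API.
SERVICE_DESK_URL = "https://servicedesk.example.com/api/tickets"

def open_ticket(summary: str, severity: str) -> None:
    """Raise a ticket in the (assumed) existing service desk system."""
    resp = requests.post(
        SERVICE_DESK_URL,
        json={"summary": summary, "severity": severity, "source": "iot-gateway"},
        timeout=10,
    )
    resp.raise_for_status()

open_ticket("Oil viscosity out of range on vehicle 42", "medium")
```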


Level 4 : Predict

All of our IoT evolution levels until now have been “nothing new”. When I was a university student thirty-something years ago, I programmed control systems during all my vacations. I took inputs from sensors, did some inferring and took some action. Admittedly, video wasn’t a sensor option in those days.

Why, then, all the excitement about IoT? Video has been added as an IoT sense option. The sensors have become much better and have fallen massively in price. And comms are now way better (there were no mobile phones, no 4G, no wifi thirty years ago).

But the great promise of IoT is when the “infer what’s going on” data is linked to machine learning. Once we do this, we can start to predict. 

We can use past data, we can use a number of data sources, and we can create systems that take action not upon immediate “ahhhhh - you need to act” data, but upon predicting what is going to happen. 

The most well-documented prediction use case when it comes to IoT is probably predictive maintenance. Sensor data is fed into a machine learning system which uses past input and outcome data to predict when failure will occur. This then allows us to fix problems at a time when the resulting downtime has minimal impact.

We can also use prediction to maximize asset utilization in production. And we can use it to minimize wastage in our supply chains. 

So, rather than using IoT to tell us our sewage pump is broken, we can proactively fix it, probably meaning there is no downtime. My car doesn’t get a service every 20,000 miles - it gets a service exactly at the point when the oil falls below the manufacturer’s tolerances (which is, amazingly, at 34,000 miles - a 70% increase in the service interval, and a real saving in servicing costs for me).
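Here’s a small, self-contained sketch of the predictive maintenance idea using scikit-learn. Everything about it is synthetic - the features, the toy failure rule that generates the labels, and the numbers - so it only shows the shape of the approach; a real project would train on logged sensor readings and maintenance records:

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic history: each row is [mean vibration, oil viscosity, hours run],
# labelled 1 if the asset "failed" soon after (per an invented toy rule).
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3)) * [2.0, 5.0, 1000.0] + [5.0, 40.0, 4000.0]
y = ((X[:, 0] > 6.0) & (X[:, 1] < 38.0)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

print("held-out accuracy:", model.score(X_test, y_test))
print("failure risk for one asset:", model.predict_proba([[7.1, 35.0, 5200.0]])[0][1])
```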

I believe that the use of machine learning is a huge, huge deal. Machine learning is a type of artificial intelligence (as is vision “inferring”). AI is set to grow by 40% every year for many years to come. And machine learning is going to be about 60% of the total AI market, in no small way, I believe, because of its use in prediction in the world of IoT.


Level 5 : Insight

“I’ve noticed that, after lots of rainfall when the average temperature is above 65 degrees, you get lots of slugs in the lower quadrant of field number 6”.  

The next evolution level, and the highest in McKinsey’s machine learning model, is “insight”. This is where the artificial intelligence system finds interrelationships in the data it is fed. AI practitioners talk about how, with AI, all data is valuable - you mustn’t pre-decide what data the AI system can see. And the reason they say this is that, in order for the AI system to provide these insights, it must have access to lots of data from lots of sources.

In my example above, we linked weather data with slug population data. Production data, customer data and problem data might be other sources for insight. Even human interaction data like twitter feeds might help provide insights.

What happens to these insights? Few customers are at the “insights” stage with machine learning, but those that are feed any possible insights to humans, who then work with the machine learning systems to make changes to systems based upon those insights. I don’t have an example from IoT, but McKinsey cites a credit card company whose machine learning system said, “I believe I know the type of customer and situations under which credit card customers will default on their payments”. The bank (the humans in the bank) then designed a prediction system that took the “digital footprint” of customers and gave them early warning of payment defaulting.

In other words, data, including IoT data, creates insights, and these insights are then used to re-design business systems.
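A crude illustration of the insight idea, using the slug example from above: merge data from more than one source and let the machine surface which measurements move together, leaving a human to interpret the result. The numbers are invented:

```python
import pandas as pd

# Toy dataset merged from two sources (weather readings and a field
# survey of field number 6); values are invented for illustration.
df = pd.DataFrame({
    "rainfall_mm": [2, 30, 45, 5, 50, 38, 1, 42],
    "avg_temp_f":  [60, 70, 72, 58, 75, 68, 55, 71],
    "slug_count":  [3, 40, 55, 2, 70, 35, 1, 60],
})

# The crudest form of "insight": a correlation scan across sources.
print(df.corr()["slug_count"].sort_values(ascending=False))
```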

Which brings us nicely to another “evolution” of IoT which our HPE high-performance computing people are very keen on. Once again, the “evolution” has broken down, because what I’m about to talk about doesn’t really come after level 5 : insight. Never mind …


Models, digital twins and simulations

When I first heard about “digital twins” I found the concept a little unnerving. Would someone, somewhere, have a “digital twin” of me? (The answer, of course, is yes. As they say, “Google knows more about you than you do” - Google already has a digital twin of me.)

In a world of digitization, we can use digital data to build up a digital model of the things from which we get data. Typically, we start with a postulated model of the “thing” - a production line, a type of cancer, etc.

We then use digital data to update this model - to get it as close to reality as we can. And we continue to update our model over time. In fact, one of the cool aspects of machine learning is that it takes data from ALL the “things” and uses this to update the model. So, if we are trying to model a type of cancer, we can get data from everyone working with this cancer type - not just one research lab. This means that our model will be updated at a very fast rate - much faster than humans could achieve.

Once we have our model, we can then run simulations on that model. So: a/ we postulate a model, b/ we update the model with data (including IoT data) from lots of instances of the real thing, and c/ we run simulations upon that model. These models are typically referred to as “digital twins”.
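A minimal sketch of that a/ b/ c/ loop, with an invented pump-wear model: postulate a parameter, refine it with observations from many real instances, then run a what-if simulation against the refined model:

```python
class PumpTwin:
    """Toy digital twin of a pump, parameterised by a single wear rate."""

    def __init__(self, wear_rate: float = 0.001):
        self.wear_rate = wear_rate  # a/ postulated wear per hour of running
        self._samples = 0

    def update(self, observed_wear: float, hours: float) -> None:
        """b/ Fold one fleet observation into the model (running mean)."""
        rate = observed_wear / hours
        self._samples += 1
        self.wear_rate += (rate - self.wear_rate) / self._samples

    def simulate_remaining_life(self, current_wear: float, limit: float = 1.0) -> float:
        """c/ What-if simulation: hours until wear reaches the failure limit."""
        return (limit - current_wear) / self.wear_rate

twin = PumpTwin()
for wear, hours in [(0.10, 400), (0.22, 900), (0.15, 650)]:  # invented fleet data
    twin.update(wear, hours)
print(f"estimated remaining life: {twin.simulate_remaining_life(0.3):.0f} hours")
```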



Mike Shaw
Director Strategic Marketing
Hewlett Packard Enterprise

Twitter: @mike_j_shaw
LinkedIn: Mike Shaw



About the Author

mikeshaw747

Mike has been with HPE for 30 years. Half of that time was in research and development, mainly as an architect. The other 15 years have been spent in product management, product marketing and, now, strategic marketing.
