Predictive Maintenance Throwdown: Part II
By Isaac Brown
Somehow IoT & AI have helped to make equipment maintenance an incredibly sexy topic in the realm of digital innovation – who would have thought? Predictive maintenance (“PdM”) is now on every digital agenda for equipment operators, manufacturers, servicers, and pure-play analytics vendors. In this article, we’ll expand upon the themes of a previous article with a focus on vendor differentiation and various deployment models.
We had previously considered an example where an injection molding machine manufacturer was remotely monitoring its machines in the field. By collecting and analyzing machine condition data for 5 years across a fleet of 50,000 machines, this manufacturer had developed high-precision models that could predict machine failure. A vendor like this – that has massive datasets from monitoring across a fleet of a single machine type – can build better, more predictive models, since a single machine type will exhibit a narrower set of failure modes. Hence many PdM vendors focus on specific types of assets.
Meanwhile, some vendors claim to have developed broad, generic failure models for many of the essential components used in industrial operations – including bearings, motors, pumps, valves, transformers, gearboxes, etc. Complex industrial systems are largely just a combination of these essential components, so if you can model the constituent components adequately, you can approach a wider range of complex systems with confidence.
Some vendors bring pre-existing machine failure models to new customers, while others train their models on historical customer data. Many industrial operators have been capturing operational data for years – and these datasets often contain machine condition data, along with resulting failure data. Modelling these historical datasets enables operators to develop an effective foundation on which to build future analytics programs.
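To make the "train on historical customer data" approach concrete, here is a minimal sketch of fitting a failure-prediction model on labeled condition data. Everything here is illustrative: the sensor channels, the synthetic data standing in for a historian export, and the choice of a random-forest classifier are all assumptions, not a description of any vendor's actual method.

```python
# Hypothetical sketch: fit a failure-prediction model on historical
# machine-condition data. Synthetic data stands in for a historian export;
# feature names and the failure rule are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)

# Each row is one machine-hour of condition data (vibration, temperature,
# pressure), labeled with whether a failure followed within a week.
n = 5000
X = rng.normal(size=(n, 3))
# In this toy dataset, failures correlate with high vibration + temperature.
y = ((X[:, 0] + X[:, 1] + rng.normal(scale=0.5, size=n)) > 2).astype(int)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

# Held-out accuracy gives a first read on whether the historical dataset
# actually carries a predictive signal worth building a program on.
accuracy = model.score(X_test, y_test)
```

The key prerequisite is the labeled outcome column: condition data alone isn't enough, which is why operators who also logged resulting failures are sitting on the most valuable foundation.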
Years ago, I advised a major oil & gas company on drill rig reliability. This company had been capturing data in OSIsoft PI historian systems for over 20 years across the fleet, but the data was just sitting in these boxes around the world… doing literally nothing, waiting to be queried after something exploded. Our job was to help this company find a partner that could ingest the PI data and build predictive failure models. This historian data became the foundation for the company’s predictive maintenance program; it turned out to be an extremely valuable asset.
Beyond considering the existing datasets and modelling techniques, equipment providers and PdM vendors need to consider multiple delivery approaches when they target different customer segments. Many operators will not allow any data to leave the walls of their operating environments. For example, I’ve spoken with precisely zero semiconductor manufacturers that are OK with data leaving the fabs. For these types of customers, vendors need to offer an on-premise solution. On-premise solutions are inherently weaker since they don’t grab data from across the fleet, and they can’t harness the horsepower of cloud storage & compute.
Assuming an operator is OK with data leaving the facilities, there are still several considerations: Is the data going to the operator’s internal private cloud, behind a firewall? Is it going into a public cloud implementation managed by the operator, or is it going to a public cloud implementation managed by the vendor? Or is there a hybrid architecture where different bits of data are shared into different data repositories? PdM vendors must address a wide range of data architectures in order to capture the largest addressable market.
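The deployment question above can be thought of as a routing decision per data-residency policy. The sketch below illustrates the idea; the policy names, destinations, and record fields are hypothetical and not drawn from any real product.

```python
# Hypothetical sketch: tag telemetry records with a destination repository
# based on the operator's data-residency policy. All names are illustrative.

POLICIES = {
    "on_premise": "local_historian",        # data never leaves the plant
    "private_cloud": "operator_private",    # operator's cloud, behind firewall
    "operator_public": "operator_managed",  # public cloud, operator-managed
    "vendor_cloud": "vendor_managed",       # public cloud, vendor-managed
}

def route(record: dict, policy: str) -> dict:
    """Return a copy of the record tagged with its destination repository."""
    if policy not in POLICIES:
        raise ValueError(f"unknown data-residency policy: {policy}")
    return {**record, "destination": POLICIES[policy]}

sample = {"machine_id": "IM-0042", "vibration_mm_s": 4.1}
routed = route(sample, "private_cloud")
```

A hybrid architecture amounts to applying different policies to different slices of the data – for example, raw condition data stays on-premise while abstracted features flow to a vendor-managed cloud.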
Finally, assuming operator customers are comfortable with the cloud, there are still additional considerations: Some customers DO NOT want their data to be combined with data from other customers (despite the obvious advantages of improved modelling). Even when the data can be combined, some customers DO NOT want the abstracted data models to be shared with other customers, understandably to keep their data from helping their competitors. The ideal customers are happy to have their data combined and modelled into a broader dataset across many operator customers, in an “everyone wins” scenario – this is where all the customers benefit from the value of a broader dataset across many machines in many environments.
To summarize, there are a ton of factors to consider when developing a PdM strategy including historical datasets, data architecture, and data ownership. Decision-making will get easier as the market matures, but in 2020, these are all important decision-making criteria.