Preventative Maintenance, Garbage In, Garbage Out


Monitoring big data drives effective preventative maintenance

Poor-quality measurement data is the biggest threat to the industrial benefits of IoT, big data, machine learning and A.I.

Momentum is building behind a gathering wave of data, feeding ever more sophisticated algorithms and driving real-time decisions. Through the information extracted by a burgeoning array of digital tools, this data is now identifying plant equipment failures 48 hours before they occur, prompting immediate preventative action and allowing extended, uninterrupted periods of monitored and reliable operation.

This predictive maintenance model is fast becoming a viable alternative to the enormous cost of routine maintenance. Yet the warning of impending plant failure, flagged by a human interface layer sitting atop a machine learning model, is entirely dependent on the data supplied by measurement devices.

With the improved security and scalability of the cloud and the falling cost of local storage, big data is no longer the reserve of tech giants, and plant maintenance is just one application with exciting prospects. The volume and velocity of the data collected and accessible within companies are ever increasing, and as algorithms become more powerful and more deeply embedded in processes, the application of their outputs will become ubiquitous. This creates a critical demand for greater data veracity.

‘Garbage in, garbage out’ is a well-known computing axiom that warns against the risk of being deceived by a combination of massive data quantities, robust models, and effectively limitless computing power. Quality is still king, and the risk of ‘garbage in, gospel out’ is an ever-growing one.

In 2018, Siemens introduced ‘Industrial Anomaly Detection’ to monitor growing networks of connected devices for spurious data. This analysis, in which a learning phase allows the program to spot malfunctioning or compromised devices, protects the network at a high level. The persisting danger is that a device functioning normally in every respect except the accuracy of its measurements is much harder to detect.
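
The general pattern behind such tools can be illustrated with a short sketch: a learning phase characterises a device's normal output, and the monitoring phase flags readings that fall outside the learned band. This is a deliberately simplified illustration, not Siemens' implementation; the device, figures and threshold below are assumptions chosen only to show the principle, and the point the article makes about subtle accuracy faults.

```python
# A minimal sketch (not Siemens' implementation) of learn-then-monitor anomaly
# detection. Device names, values and thresholds are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

def learn_baseline(readings):
    """Learning phase: characterise normal behaviour as a mean and a spread."""
    return float(np.mean(readings)), float(np.std(readings))

def is_anomalous(value, baseline, threshold=4.0):
    """Monitoring phase: flag readings far outside the learned band."""
    mean, std = baseline
    return abs(value - mean) > threshold * std

# Learning phase: a long run of healthy flow-meter data (litres per minute).
healthy = 120.0 + rng.normal(0.0, 1.5, 10_000)
baseline = learn_baseline(healthy)

# A malfunctioning device producing wild values is caught immediately...
print(is_anomalous(175.0, baseline))          # True

# ...but a device that has quietly drifted 3 % out of calibration still
# reports values well inside the learned band and passes unnoticed.
print(is_anomalous(120.0 * 1.03, baseline))   # 123.6 -> False
```

The second check is the quiet failure mode described above: the reading is statistically plausible, so a purely behavioural monitor has nothing to object to, even though every downstream calculation is now subtly wrong.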

In applications including safety-critical functions and high-profile systems, redundancy and signal comparison can be used to detect this poor-quality data. Multi-sensor Bayesian data fusion coupled with Kalman filtering has been demonstrated to reduce uncertainty and inconsistency within data; however, it is not always a cost-effective or viable solution.
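
As a rough illustration of the redundancy-and-comparison idea, the sketch below simulates two redundant temperature sensors, one of which slowly drifts out of calibration. It fuses the readings with inverse-variance (Bayesian) weights, smooths the result with a simple one-dimensional Kalman filter, and flags the pair whenever the two channels disagree by more than their joint noise allows. The sensor roles, noise figures and thresholds are illustrative assumptions, not values from any particular plant or product.

```python
# A minimal sketch of redundancy, signal comparison, Bayesian fusion and a
# 1-D Kalman filter. All names, noise levels and thresholds are assumptions.
import numpy as np

rng = np.random.default_rng(1)

# Simulated "true" process value: a slowly wandering temperature in deg C.
n = 200
truth = 80.0 + np.cumsum(rng.normal(0.0, 0.05, n))

# Two redundant sensors; sensor B develops a calibration drift halfway through.
var_a, var_b = 0.2 ** 2, 0.3 ** 2
sensor_a = truth + rng.normal(0.0, np.sqrt(var_a), n)
sensor_b = truth + rng.normal(0.0, np.sqrt(var_b), n)
sensor_b[n // 2:] += np.linspace(0.0, 2.5, n - n // 2)   # gradual drift

# Signal comparison: the two readings should agree to within their joint noise.
disagreement_sigma = np.sqrt(var_a + var_b)

# 1-D Kalman filter state: estimate x and its variance P.
x, P = sensor_a[0], 1.0
process_var = 0.05 ** 2          # how far the true value can move per step

for k in range(1, n):
    # Redundancy check: flag the pair when they disagree by more than 3 sigma.
    # (With only two sensors we can tell *that* one is wrong, not *which*.)
    if abs(sensor_a[k] - sensor_b[k]) > 3.0 * disagreement_sigma:
        print(f"step {k:3d}: sensors disagree by "
              f"{abs(sensor_a[k] - sensor_b[k]):.2f} deg C -- check calibration")

    # Bayesian fusion of the two readings (inverse-variance weighting).
    fused = (sensor_a[k] / var_a + sensor_b[k] / var_b) / (1 / var_a + 1 / var_b)
    fused_var = 1.0 / (1 / var_a + 1 / var_b)

    # Kalman predict (process assumed roughly constant between samples), then update.
    P += process_var
    K = P / (P + fused_var)
    x += K * (fused - x)
    P *= 1.0 - K

print(f"final estimate {x:.2f} deg C vs true value {truth[-1]:.2f} deg C")
```

Because the fusion continues to trust the drifting sensor, the final estimate ends up biased away from the true value even while the disagreement alarms are firing: exactly the ‘garbage in’ problem described above. In practice the flagged channel would be isolated, excluded from the fusion, or recalibrated.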

Actively managed and correctly calibrated sensors and measurement devices give credibility to the insights born of the algorithms they feed. Confidence in the measurement gives confidence in the system and its conclusions. When drift and error inevitably creep in, there are negative implications not only for efficiency, quality, reliability and, ultimately, financial results, but also for the advances now available through putting every piece of measurement data to work.

A well-developed calibration program for devices is critical to all modern quality and maintenance systems, but it is also the bedrock on which measurement-data-driven, A.I.-powered insights, and their potential to transform industry, must be built.

Young Calibration Ltd, 5 Cecil Pashley Way, Shoreham by Sea, West Sussex, BN43 5FF
Web: www.youngcalibration.co.uk
