Why manufacturers are being left in the dark when it comes to data


By Ruban Phukan
Friday, 26 October, 2018



The Fourth Industrial Revolution has been characterised by a demand for perfection: manufacturers are expected to produce the perfect product as efficiently as possible in terms of time and resources. But what happens when they run into unexpected obstacles such as interruptions or malfunctions?

Breakdowns in production lines can cost manufacturers anywhere from $50,000 to $2 million per hour.[1] Unsurprisingly, manufacturers often can’t afford the expenses associated with unforeseen downtime. Yet according to technology research company Vanson Bourne,[2] 82% of companies have experienced at least one unplanned outage over the past three years.

While manufacturers are investing heavily in data-led technologies such as the Industrial Internet of Things (IIoT), machine learning and artificial intelligence (AI), they are not always equipped with the analytics skills needed to use the data these technologies gather to its fullest potential. In fact, a recent Capgemini study[3] found this to be the case for almost 60% of organisations.

We can see a clear gap between the potential of technologies like the IIoT and the realisation of this potential. So what’s the problem?

A stab in the dark

‘Dark data’ is a common but little-acknowledged problem experienced by most manufacturers. It occurs when a company is generating information but is unable to use it in a meaningful way.

Dark data often rears its head when machines generate data that is never made visible and is therefore never used in decision-making. Just as frequently, the disconnect occurs because there is no adequate storage to retain data long enough to process it, or because data science teams cannot scale to the sheer volume of data being produced. Alternatively, companies are unable to hire enough data scientists to process all the gathered information, so they end up working with just a limited sample. In addition, many predictive maintenance systems send alerts for too many anomalies (false positives) or too few (false negatives), and manufacturers pay the price either way.
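How heavy that price is depends largely on where the alert threshold sits. The toy Python snippet below (all readings and fault labels are invented purely for illustration) shows how a low threshold floods operators with false positives while a high one silently misses real faults:

```python
# Toy illustration: the same sensor readings scored against two alert thresholds.
# The readings and the "truly faulty" labels are invented for demonstration only.
readings = [0.9, 1.7, 1.0, 2.8, 1.2, 3.5, 1.0, 2.9]
truly_faulty = [False, False, False, True, False, True, False, True]

for threshold in (1.5, 3.0):
    alerts = [r > threshold for r in readings]
    false_positives = sum(a and not t for a, t in zip(alerts, truly_faulty))
    false_negatives = sum(t and not a for a, t in zip(alerts, truly_faulty))
    print(f"threshold {threshold}: {false_positives} false positives, "
          f"{false_negatives} false negatives")
```

Run as written, the low threshold raises a spurious alert and the high threshold misses two genuine faults; neither extreme is affordable on a production line.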

Getting predictive maintenance right can have a real impact on the bottom line. To give a practical example, more than a third of manufacturers lose 1–2% of their annual sales[4] to scrap and rework. This loss could be avoided by putting effective systems in place to identify issues before the quality-check stage, ultimately saving valuable resources.

The light at the end of the tunnel

The good news is there has been a shift in industry thinking, and some manufacturers are now implementing effective ways of storing and processing data.

The IIoT and AI are among a slew of technologies that can detect early signals of future problems and help manufacturers take proactive action to prevent them. Automating the analysis of a growing number of datasets is key to mitigating the risk of dark data and predicting machine health accurately.
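As a concrete, deliberately simplified sketch of what such automation can look like, the Python snippet below flags readings that drift away from a machine’s recent behaviour. The file name, column names and three-sigma rule are assumptions made for this sketch, not a description of any particular product:

```python
import pandas as pd

# Illustrative only: timestamped vibration readings for a single machine.
# The file name and column names are assumptions made for this sketch.
readings = pd.read_csv("machine_07_vibration.csv", parse_dates=["timestamp"])

# Learn the machine's recent "normal" with a rolling window (here 288 samples,
# e.g., one day of 5-minute readings), then flag readings that drift more than
# three standard deviations away from it.
window = readings["vibration_mm_s"].rolling(window=288, min_periods=50)
baseline = window.mean()
spread = window.std()
readings["anomaly"] = (readings["vibration_mm_s"] - baseline).abs() > 3 * spread

print(readings.loc[readings["anomaly"], ["timestamp", "vibration_mm_s"]])
```

Even a rule this simple, applied automatically across every sensor stream rather than the sample a human team can inspect, starts turning dark data into alerts someone can act on.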

Applying a cognitive approach to predictive maintenance is a good way to kick off this process. While a manual approach to predictive maintenance is useful for identifying common issues that occur across all machines, it can only draw on known problems from past experience and assumes that only outliers are anomalies.

By implementing a cognitive, ‘machine-first’ approach to anomaly detection, manufacturers can create a mechanism where the algorithms can adapt to changing conditions and learn the data domain for each individual machine. This knowledge is then transferred across similar machines and validated through feedback from subject matter experts.
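One simple way such a mechanism might look in code is sketched below: each machine carries its own online baseline that adapts as readings arrive, a new machine’s model can be seeded from a similar machine’s, and expert feedback nudges the alert threshold. Everything here (the class name, methods and Welford-style update) is a hypothetical illustration of the pattern, not the method of any specific product:

```python
import math

class MachineBaseline:
    """Per-machine anomaly detector that learns its own data domain online."""

    def __init__(self, tolerance: float = 3.0):
        self.mean = 0.0
        self.var = 0.0
        self.count = 0
        self.tolerance = tolerance  # widened or tightened by expert feedback

    def update(self, value: float) -> bool:
        """Fold in a new reading; return True if it looks anomalous."""
        anomalous = (self.count > 30 and
                     abs(value - self.mean) > self.tolerance * math.sqrt(self.var))
        # Welford-style online update, so the baseline adapts to changing conditions.
        self.count += 1
        delta = value - self.mean
        self.mean += delta / self.count
        self.var += (delta * (value - self.mean) - self.var) / self.count
        return anomalous

    def feedback(self, was_false_positive: bool) -> None:
        """Subject matter experts validate alerts; their feedback tunes the threshold."""
        self.tolerance *= 1.1 if was_false_positive else 0.95

    @classmethod
    def transfer_from(cls, other: "MachineBaseline") -> "MachineBaseline":
        """Seed a new machine's model from a similar machine's learned baseline."""
        seeded = cls(tolerance=other.tolerance)
        seeded.mean, seeded.var, seeded.count = other.mean, other.var, 31
        return seeded
```

The design point that matters is that `update` never compares a machine to a fleet-wide average: each instance models one machine, while `transfer_from` lets knowledge move across similar machines and `feedback` keeps subject matter experts in the loop.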

In layman’s terms: a cognitive approach will eventually create a fully automated and cognitively enabled machine learning system that can predict anomalies before they occur. Imagine how many resources could be saved if manufacturers were alerted to potential downtime and could fix the issue before a costly interruption.

The next frontier

Even in this age of digitisation, manufacturers are still left in the dark when it comes to knowing when equipment is due for maintenance, upgrade or replacement.

Investing in data-led technologies and taking a cognitive approach can help build a rock-solid foundation for accurate anomaly detection and enable truly efficient predictive maintenance strategies. Using these technologies to solve the dark data problem offers a competitive advantage to any organisation brave enough to take the plunge.

References

  1. Vanson Bourne Ltd 2017, ‘After the Fall: Cost, Causes and Consequences of Unplanned Downtime’, published on behalf of ServiceMax, <https://lp.servicemax.com/Vanson-Bourne-Whitepaper-Unplanned-Downtime-LP.html>
  2. Ibid.
  3. Subrahmanyam KVJ 2018, ‘Unlocking the Internet of Things: Why scale is the smart route to high value’, Capgemini, <https://www.capgemini.com/2018/04/unlocking-the-internet-of-things-why-scale-is-the-smart-route-to-high-value/>
  4. Vanson Bourne Ltd 2017, op. cit.

Ruban Phukan is the co-founder and Chief Product & Analytics Officer at DataRPM (acquired by Progress), where he leads product and data science for the flagship Cognitive Predictive Maintenance product, which solves the complex business problems of minimising asset failures and unplanned downtime while maximising efficiency in the Industrial IoT.

Image credit: ©stock.adobe.com/au/Olivier Le Moal
