Seeking self-adjusting instruments
Switch on any modern projector and the keystone correction automatically squares the image on the screen. Modern process instruments incorporate a fair degree of intelligence to optimise process variable accuracy and reliability, and can even support predictive maintenance. Yet in two critical respects they fail miserably, even compared with the humble projector: self-adjustment and self-healing.
From the modern digital camera that automatically adjusts focus and exposure for the best possible image, to the high-tech engine management systems in sophisticated automobiles, the self-adjustment concept has well and truly taken hold. BMW’s intelligent drive system is a case in point: the car adjusts gear selection to better suit one’s driving style. The variable timing of the intake valves and the engine fuel injection are optimised within tolerable limits, significantly improving fuel consumption, and these critical variables even auto-adjust between services.
If process instruments had evolved in a similar fashion, we would see self-adjusting electromagnetic flowmeters — the diameter would vary according to the flowrate to produce the most accurate and reliable flow measurements. Level measuring devices that utilise the ‘time-of-flight’ principle, such as ultrasonic and radar, would self-adjust according to the properties of the ‘air’ through which the measurements are made — why not automatically compensate for the density, moisture or chemical composition of the gas in the vessel?
Today, the majority of process instrumentation still uses 4–20 mA signal transmission (as opposed to digital fieldbus technology), and therefore every device has to be ‘ranged’ during initial commissioning. Imagine instead if smart sensors and pressure transmitters could self-adjust to maximise their accuracy within the actual operating range. This would be of tremendous benefit to users.
If, for example, a pressure transmitter is calibrated from 0 to 100 bar with an accuracy of 0.01% of span, one can determine the worst-case error at any point in that range: ±0.01 bar. However, if the pressure being measured varies only between 50 and 60 bar, a clever instrument would self-adjust and scale itself to operate between 40 and 70 bar, shrinking the worst-case error to ±0.003 bar. The result would be a far more accurate instrument.
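The arithmetic behind that gain can be sketched as follows, assuming — as is the usual convention for pressure transmitters — that accuracy is quoted as a percentage of the calibrated span:

```python
# Illustrative only: worst-case error as a fraction of calibrated span.

def max_error_bar(span_low, span_high, accuracy_pct):
    """Worst-case measurement error, in bar, for a given calibrated span."""
    return (span_high - span_low) * accuracy_pct / 100.0

wide = max_error_bar(0, 100, 0.01)    # factory range: 0 to 100 bar
narrow = max_error_bar(40, 70, 0.01)  # self-adjusted range: 40 to 70 bar

# Worst-case error shrinks from ±0.01 bar to ±0.003 bar,
# more than a threefold improvement from re-ranging alone.
print(wide, narrow)
```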
There are other benefits as well. A typical 0 to 100 bar pressure transmitter may be set to alarm if the reading drops below 10 bar or exceeds 90 bar. However, if the process normally operates between 50 and 60 bar, a process malfunction that spikes the pressure to 70 bar will pass unnoticed. A self-adjusting instrument, with alarm limits that track the actual operating band, could pick this up right away.
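A minimal sketch of that idea — the function name and 15% margin are invented for illustration, not any vendor's device logic — shows how limits derived from the observed operating band would catch a spike that fixed limits miss:

```python
# Hypothetical adaptive alarm limits derived from recent readings.

def adaptive_limits(history, margin=0.15):
    """Alarm limits: the observed operating band plus a safety margin."""
    lo, hi = min(history), max(history)
    band = hi - lo
    return lo - margin * band, hi + margin * band

normal = [50.0, 52.0, 55.0, 58.0, 60.0] * 20  # process runs at 50-60 bar
low_alarm, high_alarm = adaptive_limits(normal)  # about 48.5 and 61.5 bar

spike = 70.0
print(10 <= spike <= 90)   # True  -> fixed 10/90 bar limits miss the spike
print(spike > high_alarm)  # True  -> the adaptive limit flags it
```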
In new plants, this issue is all the more obvious and the ramifications are manifold. Each new instrument is calibrated and its measurement range set to maximise accuracy, but as the process settles down, the optimum operating band for every instrument continues to shift. Maintaining accuracy therefore requires repeated recalibration and recommissioning of the instruments. This is a laborious, time-consuming process, and it can take years for the plant to stabilise.
But it does not have to be this way. Imagine if the instruments had the ability to store data, analyse it periodically, register the low and high points over a period of time and self-adjust to operate in the optimum range. This would also eliminate the need to shut down plants in order to optimise the instrumentation. The resultant savings in time, human resources and money, as well as the impact on product quality and batch consistency, would be tremendous.
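The store–analyse–adjust loop described above can be sketched like this; the class, method names and 20% headroom figure are invented for illustration, not taken from any real transmitter firmware:

```python
from collections import deque

class SelfRangingTransmitter:
    """Sketch of an instrument that re-ranges itself around observed data."""

    def __init__(self, lower, upper, window=1000, headroom=0.2):
        self.lower, self.upper = lower, upper   # current calibrated span
        self.history = deque(maxlen=window)     # rolling store of readings
        self.headroom = headroom                # margin around observed band

    def record(self, reading):
        self.history.append(reading)

    def readjust(self):
        """Periodic analysis: rescale the span around the low/high points."""
        lo, hi = min(self.history), max(self.history)
        pad = (hi - lo) * self.headroom
        self.lower, self.upper = lo - pad, hi + pad

tx = SelfRangingTransmitter(0, 100)
for p in [52, 55, 58, 60, 50, 57]:  # process settles between 50 and 60 bar
    tx.record(p)
tx.readjust()
print(tx.lower, tx.upper)  # span narrows to 48.0-62.0, no shutdown needed
```

The rolling window means old extremes eventually age out, so the span keeps tracking the process as it drifts.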
Why stop at self-adjusting instruments? Let’s consider self-adjusting processes. With the recent acceptance by process industries of fieldbus technology, both Profibus and Foundation Fieldbus, we are able to extract significantly more data from the processes than previously. For example, a differential pressure transmitter can provide the measurement for level or flow, as well as the process pressure and temperature.
Using this additional information plantwide, it should be possible for the control system to process the data and make the necessary process changes, reducing energy costs and saving processing time, without sacrificing quality. Why add too much flocculant, only to have to recover it later? Why cool pipework only to heat it again for hygienic cleaning (CIP)? Why keep filling a tank that never empties below half full?
Until now, instruments have been designed around the reactive or planned reactive philosophy. We now have the technology to move them up to the next level and make them proactive without any major impact on their cost. And after we cross the self-adjusting hurdle, what’s next? Self-healing. Car makers are developing tyre systems that can self-check and self-inflate in the event of minor leaks. I see no reason why we can’t have self-healing instruments.
*John Immelman, managing director, Endress+Hauser Australia