Why an Accurate Calibration Method Is Needed for Better Control of Instruments

Many people do a field comparison check of two meters and call them “calibrated” if they give the same reading. This isn’t calibration; it’s simply a field check. It can show you that there’s a problem, but it can’t show you which meter is right. If both meters are out of calibration by the same amount and in the same direction, it won’t show you anything. Nor will it show you any trending, so you won’t know your instrument is headed for an “out of cal” condition. For a good calibration, the calibration masters must be more accurate than the instrument being calibrated.
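The blind spot of a field comparison check can be sketched in a few lines. The readings below are hypothetical, but they show why two meters agreeing proves nothing when both share the same drift:

```python
# Minimal sketch (hypothetical readings): a field comparison check cannot
# detect a bias that both meters share.
true_voltage = 230.0            # reference value, unknown in the field
shared_bias = 1.5               # both meters have drifted by the same amount

meter_a = true_voltage + shared_bias
meter_b = true_voltage + shared_bias

# The field check only compares the two meters with each other:
agree = abs(meter_a - meter_b) < 0.1
print(agree)                        # True -- the check passes...

# ...yet both meters are wrong against the reference standard:
print(abs(meter_a - true_voltage))  # 1.5
```

Only a comparison against a more accurate reference standard, i.e. a real calibration, exposes the shared error.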

Calibration Method

Calibration is a cornerstone of instrument management. When an instrument is known to be calibrated, its readings can be trusted, which gives confidence in both the instrument and the measurements made with it.

Calibration is the set of operations that establish, under specified conditions, the relationship between the values indicated by a measuring instrument or measuring system (or the values represented by a material measure) and the corresponding values realized by a reference standard.
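In practice, that relationship is recorded as a table of indicated values against reference values. A minimal sketch, using hypothetical readings at five calibration points:

```python
# Hedged sketch: calibration establishes the relationship between the values
# an instrument indicates and the values realized by a reference standard.
# All readings below are hypothetical.
reference = [0.0, 25.0, 50.0, 75.0, 100.0]   # values applied by the standard
indicated = [0.2, 25.3, 50.1, 75.4, 100.6]   # values shown by the instrument

# The calibration record is the deviation at each point:
deviations = [round(i - r, 2) for i, r in zip(indicated, reference)]
print(deviations)  # [0.2, 0.3, 0.1, 0.4, 0.6]
```

The deviation table is what lets you judge the instrument against its tolerance, and it is also the data that reveals trending between calibrations.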

Calibration involves checking the operational integrity of a piece of test or measuring equipment, or of a measurement standard of unverified accuracy, in order to detect, correlate, report, or eliminate (by adjustment) any deviation from the required accuracy, capability, or other required performance. Calibration can be carried out for three possible purposes:

  1. Determining whether or not a given instrument or standard is within an established tolerance with respect to its deviation from a reference standard.
  2. Reporting the deviations of measurements from nominal values.
  3. Repairing or adjusting the instrument or standard to bring it back within tolerance.
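The three purposes above can be illustrated as a single check at one calibration point. The tolerance and readings here are hypothetical:

```python
# Sketch of the three calibration purposes as a tolerance check.
# The tolerance and readings are hypothetical examples.
TOLERANCE = 0.5  # maximum allowed deviation from the reference standard

def check_point(indicated, reference, tolerance=TOLERANCE):
    """Purpose 1: is the instrument within the established tolerance?"""
    deviation = round(indicated - reference, 3)  # Purpose 2: report deviation
    within = abs(deviation) <= tolerance
    # Purpose 3: if out of tolerance, this correction would be adjusted in
    correction = 0.0 if within else -deviation
    return deviation, within, correction

print(check_point(100.3, 100.0))  # (0.3, True, 0.0)   -- within tolerance
print(check_point(101.2, 100.0))  # (1.2, False, -1.2) -- needs adjustment
```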

Why Accurate Calibration Method is needed

  • To control a process, the relationship between its inputs and outputs must be controlled
  • To control the process, we need to know its current status
  • Accurate measurement provides that status information
  • Control of a process can never be better than the measurements made in the process
  • The more accurate the data obtained from the process, the more accurately the process can be controlled
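The point that control can never be better than measurement shows up directly in a simple control loop. In this illustrative sketch (hypothetical numbers, plain proportional control), a sensor that reads 2 units high makes the process settle exactly 2 units below setpoint, no matter how well the controller works:

```python
# Illustrative sketch: a controller can only hold a process as close to
# setpoint as its measurement allows. All numbers are hypothetical.
setpoint = 100.0
sensor_bias = 2.0        # the sensor reads 2 units higher than reality
process_value = 90.0     # true process value at start

for _ in range(200):                        # simple proportional control loop
    measured = process_value + sensor_bias  # controller sees this, not truth
    process_value += 0.1 * (setpoint - measured)

print(round(process_value, 3))  # 98.0 -- offset by exactly the sensor bias
```

The controller drives the *measured* value to setpoint, so the whole measurement error passes straight through into the process.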

Causes of Calibration Issues

What knocks a digital instrument “out of cal”? First, the major components of test instruments (e.g., voltage references, input dividers, current shunts) simply shift over time. This shifting is minor and usually harmless if you keep a sound calibration schedule, and it is exactly what calibration finds and corrects.

But suppose you drop a current clamp, and drop it hard. How do you know that clamp will measure accurately now? You don’t; it may well have gross calibration errors. Similarly, exposing an instrument to an overload can throw it off. Some people think this has little effect because the inputs are fused or breaker-protected, but those protection devices may not trip on a transient. Also, a large enough voltage input can jump across the input protection device entirely. This is far less likely with higher-quality instruments, which is one reason they are more cost-effective than the less expensive imports.

Instrument Calibration Process for Pressure Transmitters

Calibrating a differential pressure transmitter involves measuring the difference in pressure between two environments. In this article, a leading NABL-approved calibration laboratory describes the method and steps of calibration for pressure transmitters under various circumstances.

Two level transmitters are connected to two columns and are used to measure the pressure of one column with respect to the other. For example, the pressure of one water column is subtracted from the pressure of the other to obtain the differential pressure. An important reason for following this calibration method is that an improperly calibrated differential pressure transmitter will not provide accurate measurements.
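The subtraction of one column pressure from the other can be sketched with the hydrostatic relation P = ρ·g·h. The column heights below are hypothetical examples:

```python
# Hedged sketch: differential pressure between two water columns, using
# the hydrostatic relation P = rho * g * h. Heights are hypothetical.
RHO_WATER = 998.0   # density of water, kg/m^3 (approximate, room temperature)
G = 9.81            # gravitational acceleration, m/s^2

def column_pressure_pa(height_m):
    """Hydrostatic pressure at the bottom of a water column, in pascals."""
    return RHO_WATER * G * height_m

high_side = column_pressure_pa(2.0)   # 2.0 m water column
low_side = column_pressure_pa(1.5)    # 1.5 m water column

dp_pa = high_side - low_side          # the transmitter reports this difference
print(round(dp_pa, 1))                # differential pressure in Pa
```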

Steps for Differential Pressure Transmitters Calibration Process:

Calibrating a differential pressure transmitter can be done in a few short steps.

  1. Set up the differential pressure transmitter, HART communicator, power supply, hand pump, and multimeter (see the calibration setup below).
  2. Make sure the equalizing valve manifold is closed.
  3. Apply a pressure to the transmitter equal to the lower range pressure (this usually corresponds to 4 mA at the transmitter output). For example, if the calibrated range is 0 to 100 mbar, the lower range pressure is 0; if the range is -2 psig to 5 psig, the lower range pressure is -2 psig.
  4. Read the pressure on the transmitter LCD (or in the HART communicator). If necessary, adjust through the HART communicator so that the transmitter output (on the LCD) matches the applied pressure.
  5. Read the mA output of the transmitter using a multimeter. If necessary, adjust through the HART communicator so that the transmitter output (on the multimeter) is 4 mA.
  6. Apply a pressure to the transmitter equal to the upper range pressure (this usually corresponds to 20 mA at the transmitter output).
  7. Read the pressure on the transmitter LCD (or in the HART communicator). If necessary, adjust through the HART communicator so that the transmitter output (on the LCD) matches the applied pressure.
  8. Read the mA output of the transmitter using a multimeter. If necessary, adjust through the HART communicator so that the transmitter output (on the multimeter) is 20 mA.
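The targets the steps above check against follow from linear 4–20 mA scaling across the calibrated range. A small sketch, assuming a linear transmitter and using the -2 to 5 psig example range from the text:

```python
# Sketch of the expected 4-20 mA output for the calibration steps, assuming
# a linear transmitter. The range values are the example from the text.
LRV, URV = -2.0, 5.0   # lower/upper range values, psig

def expected_ma(pressure):
    """Linear 4-20 mA scaling across the calibrated range."""
    return 4.0 + 16.0 * (pressure - LRV) / (URV - LRV)

print(expected_ma(-2.0))  # 4.0  -> target for step 5 (lower range)
print(expected_ma(5.0))   # 20.0 -> target for step 8 (upper range)
print(expected_ma(1.5))   # 12.0 -> a useful mid-range check point
```

Comparing the multimeter reading against these expected values at several points (not just the range ends) also reveals linearity errors, not only zero and span errors.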

Typical tools required:

  • 24 VDC power supply
  • Digital multimeter
  • Pneumatic hand pump (up to 600 psig)
  • Hydraulic hand pump (up to 10,000 psig)
  • Low pressure hand pump
  • High precision digital test gauge
  • HART communicator
  • Screwdriver toolkit