Pressure Sensor Terminology

When it comes to pressure sensors, there is a lot of confusion about the different terms and specifications used. I have compiled a list to help explain some of these terms and how they apply to pressure sensors.

Absolute Pressure - (PSIA) - This type of sensor measures pressure relative to a perfect vacuum. In other words, at sea level, with the pressure port exposed to the atmosphere, it would read 14.7 PSI (14.7 PSI = 1 atmosphere). The output will change as the atmospheric pressure changes.

Gauge Pressure - (PSIG) - This sensor measures pressure relative to ambient pressure. The back of the sensing diaphragm is exposed to the atmosphere through some sort of vent. Any change in atmospheric pressure affects both sides of the diaphragm equally, so the output stays steady.

Sealed Gauge - (PSIS) - This sensor measures pressure relative to a sealed reference pressure. It is like an absolute sensor in that the back of the diaphragm is sealed off; however, it is sealed at roughly one atmosphere rather than a vacuum.

Differential - This sensor measures pressure at one port in comparison to the pressure at a second port.
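The relationships among these reference types boil down to simple arithmetic. As a quick sketch (the function names are mine, not from any standard, and the 14.7 PSI atmosphere is the sea-level figure quoted above):

```python
ATM_PSI = 14.7  # standard atmosphere at sea level, per the figure above

def psia_to_psig(psia, atmospheric=ATM_PSI):
    """Gauge pressure is absolute pressure minus ambient atmospheric pressure."""
    return psia - atmospheric

def psig_to_psia(psig, atmospheric=ATM_PSI):
    """Absolute pressure is gauge pressure plus ambient atmospheric pressure."""
    return psig + atmospheric

def differential(port1_psia, port2_psia):
    """A differential sensor reports the difference between its two ports."""
    return port1_psia - port2_psia

# An absolute sensor open to the atmosphere at sea level reads ~14.7 PSIA,
# while a gauge sensor in the same conditions reads 0 PSIG.
print(psia_to_psig(14.7))  # 0.0
```

Note that a sealed-gauge sensor behaves like a gauge sensor whose "atmospheric" term is frozen at whatever pressure was trapped behind the diaphragm at the factory.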

Full Scale Pressure - The maximum rated operating pressure.

Pressure Range - The algebraic difference between the maximum pressure and the minimum pressure over which the device is calibrated.

Proof Pressure - The specified pressure which may be applied to the sensing element of a transducer without causing a permanent change in the output characteristics. If exceeded, the sensing element may be distorted (it still operates but is no longer in spec) or destroyed.

Burst Pressure Rating - The maximum pressure that can be applied to a transducer/sensor without causing catastrophic failure. This rating should never be exceeded. Proof and burst pressures are often listed as multiples of the full-scale pressure.

Linearity - The maximum deviation of the calibration curve from a straight line between zero and full scale, expressed as a percent of full-scale output.
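To make the linearity definition concrete, here is a minimal sketch (the function name and data points are my own, for illustration): it draws the straight line through the zero and full-scale calibration points and reports the worst deviation from it as a percent of full-scale output.

```python
def linearity_pct_fs(pressures, outputs):
    """Max deviation of the calibration points from the straight line
    joining the zero and full-scale points, as % of full-scale output."""
    p0, pf = pressures[0], pressures[-1]
    v0, vf = outputs[0], outputs[-1]
    span = vf - v0  # full-scale output span
    worst = 0.0
    for p, v in zip(pressures, outputs):
        ideal = v0 + (p - p0) / (pf - p0) * span  # straight-line prediction
        worst = max(worst, abs(v - ideal))
    return 100.0 * worst / span

# A sensor that reads 2.6 V at mid-scale instead of the ideal 2.5 V
# (on a 0-5 V output) has roughly 2% FS linearity error.
err = linearity_pct_fs([0.0, 50.0, 100.0], [0.0, 2.6, 5.0])
```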

Repeatability - The ability to produce the same output with consecutive applications of the same pressure.

Hysteresis - The maximum difference in output between the increasing (min-to-max) pressure sweep and the decreasing (max-to-min) sweep at the same pressure. The deviation is expressed as a percent of full scale.
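Both repeatability and hysteresis can be computed from sweep data. A minimal sketch (function names are mine; sweep outputs are assumed to be recorded at the same pressure points):

```python
def hysteresis_pct_fs(up_outputs, down_outputs, full_scale):
    """Worst output difference between the increasing (min-to-max) sweep and
    the decreasing (max-to-min) sweep at matching pressure points, % of FS."""
    worst = max(abs(u - d) for u, d in zip(up_outputs, down_outputs))
    return 100.0 * worst / full_scale

def repeatability_pct_fs(runs, full_scale):
    """Worst spread in output across repeated applications of the same
    pressure points, % of FS. `runs` is a list of per-run output lists."""
    worst = max(max(col) - min(col) for col in zip(*runs))
    return 100.0 * worst / full_scale
```

For example, if the downward sweep reads 0.1 V high at zero pressure on a 5 V full-scale output, `hysteresis_pct_fs` reports about 2% FS.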

Accuracy - The only specification that we list on our site concerning the precision of a pressure sensor. This can mean one of two things, so it is important to check the data sheet.

  • Typical Accuracy - A combination of linearity, repeatability, and hysteresis at room temperature.
  • Total Error - The maximum allowable error for each of linearity, repeatability, and hysteresis over the operating temperature range.