Sensors are common devices used to detect a change in a physical state and quantify the result on a particular scale or range. In general, sensors can be classified into two types: analog and digital.
So which one is better in your application? Let’s take a temperature sensor as a simple example.
The difference between analog and digital sensors
A temperature sensor with an analog output, such as the TMP235, requires the system to apply the device's analog transfer function (output voltage versus temperature) to determine the temperature. A sensor with a digital output, such as the TMP112, does not require the system to know or implement that internal analog transfer function to determine the temperature.
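As a rough sketch of what "applying the transfer function" means on the analog side, the snippet below converts an ADC code into degrees Celsius using the nominal TMP235 characteristics of a 500 mV offset at 0 °C and a 10 mV/°C slope; the 12-bit ADC resolution and 3.3 V reference are assumptions for illustration, so check your own ADC and the datasheet values before reusing it.

```c
#include <stdint.h>

/* Sketch: convert an ADC code from a TMP235-style analog sensor to degrees C.
 * Assumes a 12-bit ADC with a 3.3 V reference (illustrative values) and the
 * nominal TMP235 transfer function: 500 mV output at 0 degrees C, 10 mV/degree C slope. */
float tmp235_code_to_celsius(uint16_t adc_code)
{
    const float vref_mv   = 3300.0f;  /* ADC reference in millivolts (assumption) */
    const float full_code = 4095.0f;  /* 12-bit full-scale ADC code (assumption)  */

    float vout_mv = (adc_code * vref_mv) / full_code;  /* ADC code -> millivolts        */
    return (vout_mv - 500.0f) / 10.0f;                 /* millivolts -> degrees Celsius */
}
```

Any error in the ADC reference or resolution feeds directly into this conversion, which is why the accuracy of an analog-sensor system depends on more than the sensor itself.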
Sensor applications
Systems with an analog sensor require an ADC to digitize the output and a look-up table (or the transfer function itself) to convert that reading into a temperature. Digital sensors already produce a digitized measurement of the temperature, which can be read back over a digital interface (e.g. I2C).
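For comparison, reading a digital sensor reduces to scaling a register value. The sketch below converts the two bytes returned from the TMP112 temperature register into degrees Celsius, assuming the default 12-bit mode in which each LSB equals 0.0625 °C; the raw bytes are expected to come from your platform's own I2C read routine, which is not shown here.

```c
#include <stdint.h>

/* Sketch: convert the two bytes read from the TMP112 temperature register
 * (pointer register 0x00) into degrees C. Assumes the default 12-bit mode,
 * where the result is left-justified and each LSB equals 0.0625 degrees C. */
float tmp112_bytes_to_celsius(uint8_t msb, uint8_t lsb)
{
    int16_t raw = (int16_t)(((uint16_t)msb << 8) | lsb);  /* combine bytes, keep sign */
    raw /= 16;                                            /* drop the 4 unused low bits */
    return raw * 0.0625f;                                 /* scale: 0.0625 degrees C per LSB */
}
```

No ADC, reference, or look-up table is involved; the conversion depends only on the fixed register format documented for the device.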
Calibration requirements
For analog sensors, the gain and offset of the ADC may need to be calibrated to achieve the desired system accuracy. System temperature accuracy is not guaranteed in the datasheet because it depends heavily on the ADC reference error. Digital sensors do not need calibration to achieve the accuracy guaranteed in their datasheets.
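One common way to handle this on the analog path is a two-point gain/offset correction, sketched below. The structure and function names are hypothetical; the gain and offset terms would come from measuring the system at two known reference temperatures and fitting a line, a step a factory-trimmed digital sensor does not require.

```c
/* Sketch of a two-point gain/offset correction for the analog signal chain.
 * The calibration constants are hypothetical and would be derived from
 * measurements at two known reference temperatures. */
typedef struct {
    float gain;    /* multiplicative correction from calibration          */
    float offset;  /* additive correction in degrees C from calibration   */
} temp_cal_t;

float apply_calibration(float raw_celsius, const temp_cal_t *cal)
{
    return raw_celsius * cal->gain + cal->offset;
}
```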
General guidance
As a general rule of thumb, digital temperature sensors are preferred in almost all cases because they reduce system-integration complexity and offer better out-of-the-box performance. Exceptions where a digital sensor cannot be used and an analog sensor is necessary include the lack of a suitable digital interface bus or cost constraints.