Temperature-stable precision ADC

I’m looking for a temperature-stable precision ADC. I’m trying to measure a couple microvolts, but need to measure the same value regardless of temperature range.
I’ve been looking at precision ADCs with a built-in PGA, but I’m having trouble determining what I can actually expect.
The NAU7802 is atrocious - I tried it and got about 10 bits of noise. (I was expecting 7 bits of noise)
I’m leaning towards the MCP3421 or MCP3561. They claim they have good ENOB, but don’t really mention error vs temperature. There’s a whole list of ADCs to choose from.

On another note, the MCP3901 claims to be an “analog front end”, but appears to be an ADC rather than only signal conditioning. Is the difference that you can program oversampling for anti-aliasing, as opposed to a pure analog anti-aliasing filter?

Hello Marshal,

There are a lot of options to choose from among ADCs. One option you may look at is the 296-48346-1-ND from TI. The accuracy is very stable over temperature, drifting only 0.01 LSB/°C. We also have an article in our article library referencing the part.

As for the MCP3901, I believe you can program it separately; the manufacturer also has a GUI that you can use to modify and capture the data.

This is a step in the right direction.
Do you have something that can get me down to 1 or 2 microvolts?


It might be helpful to start by diagramming the signal chain and potential error sources therein, as well as establishing design requirements with greater specificity than related above.

You’ve indicated that you’re trying to “…measure a couple microvolts…regardless of temperature range…” It would be more helpful to phrase that as saying, for example, that you’re looking to achieve measurement repeatability to within ±1uV over a system temperature range of -40 to +85°C.

If that’s actually the scale of what you’re after, you’ll probably need to be considering a lot more than ADC selection. Factors as mundane as the thermocouple effect caused by differences in the element and lead materials of a resistor can introduce microvolt-scale errors with temperature gradients of less than 1°C. The system voltage reference will drift with time and temperature, as will the gain and offset errors of a converter and any amplifiers used. Thermal and 1/f noise sources alone might well add up to microvolt levels depending on signal source and measurement bandwidth, and simply clocking data out of a converter can introduce microvolt-scale signals into a system due to interconnect impedances. Microvolts are messy…
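To get a rough sense of scale for the thermal-noise contribution alone, here's a back-of-envelope sketch (the 1 kΩ source resistance and 10 kHz bandwidth are illustrative assumptions, not figures from this thread):

```python
import math

def thermal_noise_vrms(r_ohms, bandwidth_hz, temp_k=300.0):
    """Johnson-Nyquist noise voltage: sqrt(4*k*T*R*B)."""
    k = 1.380649e-23  # Boltzmann constant, J/K
    return math.sqrt(4 * k * temp_k * r_ohms * bandwidth_hz)

# 1 kOhm source resistance, 10 kHz measurement bandwidth, room temperature
vn = thermal_noise_vrms(1e3, 10e3)
print(f"{vn * 1e6:.2f} uV RMS")  # ~0.41 uV RMS
```

So even a modest source impedance eats a meaningful fraction of a microvolt budget once the bandwidth opens up, before any 1/f or interconnect effects are counted.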

Also, the notion of an “analog front end” is a somewhat nebulous industry term, often used to describe a component integrating signal processing, interface, support, and/or data conversion functions in a configuration that contemplates some relatively specific application or end use, such as interrogation of bridge-configured sensors, electrocardiography, or ultrasound probes.



Thank you for your reply. It looks like I need to hook up an instrumentation amp and investigate my sources of noise and error.

I’m trying to measure 0.1°C changes in temperature over a period of 40 s. Currently I’m using an RTD, but I’m considering switching to a thermistor for higher sensitivity. I’d like a resolution of 0.001°C.

The expected temperature range of both the sensor and the ADC is 0 to 50°C (although I’m not measuring at the ADC; I’m measuring inside a tree trunk).

I’m currently using a Wheatstone bridge to provide a differential signal.
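To put numbers on why microvolts come into it, here's a hedged sketch of the bridge output for a small RTD change (assuming a Pt1000 element, 3.3 V excitation, and an equal-arm bridge; all values are my illustrative assumptions):

```python
def bridge_output_uV(v_ex, r_nom, delta_r):
    """Approximate differential output of an equal-arm Wheatstone bridge
    for a small resistance change in one arm: V_ex * dR / (4 * R)."""
    return v_ex * delta_r / (4 * r_nom) * 1e6  # result in microvolts

# Pt1000 is ~3.85 ohm/degC, so a 0.001 degC step is ~0.00385 ohm
v_ex = 3.3            # excitation voltage (assumed)
r_nom = 1000.0        # Pt1000 nominal resistance
delta_r = 3.85 * 0.001
print(f"{bridge_output_uV(v_ex, r_nom, delta_r):.2f} uV")  # ~3.18 uV
```

Under those assumptions, 0.001°C of resolution really does mean resolving roughly 3 µV of bridge output, which ties the two halves of this thread together.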

I also had an outdoor strain gauge application in mind, thus the desire to measure microvolts, but I think it wasn’t a good plan to try to solve two problems with the same circuit.

It’s perhaps not so much a question of investigation, as much as understanding what the repeat offenders are and putting them behind bars preemptively. Otherwise the contributions of an unrecognized error source tend to look like data…

If temperature be the item of interest, I might suggest a higher end digital-output temp sensor such as the ADT7x20 or TMP117. Those will resolve to 0.0078°C with initial accuracy of 0.x°C, can tie directly to a microcontroller for data logging, and come with a pretty good calibration that you don’t get with a roll-your-own solution.

Note that at atmospheric temperatures, 0.1°C already implies an ability to resolve the measured quantity at 4-digit (1 part in 10,000) levels, as there are 273 (and change) degrees below zero on the Celsius scale that are sorta important… Divide that increment by another two zeros, and the chances of a person not actually measuring what they intend to be measuring become pretty significant.
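As a quick sanity check of that dynamic-range point, here's the arithmetic in one place (my numbers, assuming the warm end of a 0 to 50°C range referenced to absolute zero):

```python
import math

t_abs_k = 273.15 + 50  # warm end of a 0-50 degC range, in kelvin
for resolution in (0.1, 0.001):
    parts = t_abs_k / resolution
    print(f"{resolution} degC -> 1 part in {parts:,.0f} (~{math.log2(parts):.1f} bits)")
```

Roughly 18-plus noise-free bits for 0.001°C, before any sensor or amplifier errors are budgeted.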

Thanks for the idea.

I think for this design I’m going to stick with two-terminal 0603 devices for a sensing element, due to their ruggedness, small size, and ease of soldering. The probes are only 2mm in diameter, so we’re currently using an 0603 RTD soldered to a flex PCB, and glued to a small nail. I would be concerned that the tiny TMP117 would get scratched upon insertion into the tree, or break off from the PCB entirely. The ADT7420 is of course too large to fit.

(This is an R&D project for an HRM sapflow meter. If you have ideas for manufacturability, I’m all ears. We switched from thermocouples to RTDs to try to get better accuracy, but I’m starting to wonder if the RTD isn’t making good contact with the tree, despite being surrounded by sap.)

A different idea for measuring small changes in temperature is to use a band-pass filter before amplification, so we no longer know the absolute temperature but can measure changes in temperature without the need for a high-precision ADC. However, this means I need an instrumentation amplifier per channel, which I was trying to avoid due to cost and size.

I guess I’m getting lost in the details here.

I saw that the TLA2024 is temperature-stable, but I didn’t see a way of searching for that. Is there some searchable property I can use to filter ADCs for temperature stability?

Makes more sense now that I’ve a better idea of the target application.

If I understand correctly, what you’re really after here is to be able to resolve small temp deltas in multiple, contemporaneous measurements that occur over minute-scale time frames. Not absolute accuracy or long term stability or even stability with regard to temp change so much, since your measurements are only concerned with what happens in the next minute or so, rather than measuring changes between now and a year from now. That actually offers some significant flexibility.

RTDs offer better linearity, thermistors more sensitivity. The latter might let you get by with a cheaper ADC, though probably would yield less consistent system behavior as a function of baseline temperature. Littelfuse and others are already in the business of making temperature probes, and can do customs if the opportunity is right. That’d be more of an off-line conversation if you’d care to have it.

I’d suggest an arrangement like that napkin-sketched below. The concept would be to use a 2+ input, high-res ADC referenced to the excitation supply and measuring the 2 intermediate points in a 3-element voltage divider created by two temperature sensors with the lowest-tempco resistor you’ve got budget for in the middle. That resistor functions as the reference for the system, which is measuring the proportion of the excitation/supply voltage appearing at the inputs, rather than the actual magnitudes of such. Assuming 1K all around, 16 bits should allow resolving to about 0.06°C with the 3850 ppm/°C tempco common among platinum RTDs, if I ran the numbers right.
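For what it's worth, a rough re-run of that estimate under one set of assumptions (a 16-bit bipolar differential reading between the two taps, referenced to the excitation, with Pt1000-class elements; these are my assumptions, not the author's worksheet) lands in the same ballpark:

```python
# Three-element divider: RTD1 on top, low-tempco reference resistor in the
# middle, RTD2 on the bottom, excited from the ADC's reference supply.
r_rtd = 1000.0        # nominal RTD resistance (1K assumed)
r_ref = 1000.0        # reference resistor (1K assumed)
tempco = 3850e-6      # typical platinum RTD tempco, per degC

# The differential ratio between the taps is R_ref / R_total; its
# sensitivity to one RTD changing resistance with temperature:
r_total = 2 * r_rtd + r_ref
dratio_per_ohm = r_ref / r_total**2
dratio_per_degC = dratio_per_ohm * (r_rtd * tempco)

# One LSB of a 16-bit bipolar converter spanning +/- the reference:
lsb = 2.0 / 2**16
temp_per_lsb = lsb / dratio_per_degC
print(f"{temp_per_lsb:.3f} degC per LSB")  # ~0.071 degC
```

Close enough to the quoted ~0.06°C that the exact figure mostly depends on how the converter's full-scale range is defined.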


Assuming a 1K reference resistor value, there are OK, good, and really good options available. I like the LTC2493 as a higher-end ADC option that seems to offer hope of achieving good measurement resolution with the lower sensitivity of RTDs, though the Nuvoton part mentioned might suffice also; some filtering might help with the noise issue.

You mentioned a surface mount RTD; how was that electrically insulated? Sap would likely be conductive enough to mess with the readings if it wasn’t.


I’m currently insulating the RTD with a thick conformal coat. It seems to work so far. I’d like to get them into a 2mm stainless tube - I haven’t figured out how to manufacture that reliably. I noticed that many RTD probes are 3mm or larger. Omega makes some that are $74 each - I’m hoping to keep the whole probe under $30. Unfortunately, that may mean we sacrifice durability.

I like the simplicity of your circuit, but it doesn’t work in my mind as you describe. It seems that the difference between CH1 and CH2 would decrease as both RTDs heated up, but this could not be distinguished from only one RTD heating up. This sounds like a sum rather than a ratio.
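One way to check that objection numerically: if the ADC reads both taps individually against the excitation (rather than only their difference), the two ratios are enough to separate the two sensors. A minimal sketch, assuming ideal ratiometric readings and the 1K values from the earlier post:

```python
def solve_rtds(ratio_ch1, ratio_ch2, r_ref):
    """Recover both RTD resistances from the two tap ratios of an
    RTD1 / R_ref / RTD2 divider measured ratiometrically, where
    ratio_ch1 = (R_ref + R2) / R_total and ratio_ch2 = R2 / R_total."""
    r_total = r_ref / (ratio_ch1 - ratio_ch2)  # CH1-CH2 spans exactly R_ref
    r2 = ratio_ch2 * r_total
    r1 = r_total - r_ref - r2
    return r1, r2

# Forward-simulate: only RTD1 heats by ~1 degC (R1 = 1003.85 ohm)
r1, r_ref, r2 = 1003.85, 1000.0, 1000.0
total = r1 + r_ref + r2
ch1, ch2 = (r_ref + r2) / total, r2 / total
print(solve_rtds(ch1, ch2, r_ref))  # recovers both RTD values
```

The difference CH1−CH2 alone is indeed ambiguous, as the post above suggests, but the pair of single-ended ratios is not: two readings, two unknowns.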