I have a Murata DMS-30PC panel voltmeter that I would like to use to measure DC volts on a power supply.
I need to scale the voltage down from 0-60 V to 0-0.2 V for the meter. Could I use a potential divider, or would an op amp be more accurate? Could you please advise me on a suitable op amp?
Welcome to the Forum!
Relating to the question above, may I ask why you are using a 0-0.2 V meter to measure 0-60 V?
It seems the easiest solution would be a meter that is appropriately scaled to your application. Looking at the same DMS-30PC series, I see they offer a 0-200 V version that would be better suited.
I have this left over from something else and just wondered if it was possible to use it.
Yes, an op amp would be the best bet, as these have high input impedance. Any general-purpose op amp should work. I will have a more detailed response tomorrow.
Datasheet for reference:
If you can afford the decrease in accuracy, you can use a 1000:1 ratio resistive voltage divider to give the DPM a +/-200V range (0.1V resolution).
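To make the 1000:1 ratio concrete, here is a minimal sketch of an unloaded divider calculation. The resistor values (999 kOhm on top, 1 kOhm on the bottom) are illustrative choices that happen to give an exact 1000:1 ratio, not values from the post:

```python
# Sketch of a 1000:1 resistive divider for the 0.2 V DPM input.
# Resistor values are hypothetical (999 kOhm + 1 kOhm = exact 1000:1).
def divider_output(v_in, r_top, r_bottom):
    """Unloaded resistive divider: V_out = V_in * R_bottom / (R_top + R_bottom)."""
    return v_in * r_bottom / (r_top + r_bottom)

R_TOP = 999_000.0     # ohms, hypothetical value
R_BOTTOM = 1_000.0    # ohms, hypothetical value

ratio = (R_TOP + R_BOTTOM) / R_BOTTOM
print(f"division ratio: {ratio:.0f}:1")
print(f"60 V in -> {divider_output(60.0, R_TOP, R_BOTTOM) * 1000:.1f} mV at the DPM")
```

With this ratio, 60 V at the supply appears as 60.0 mV at the meter, so the 199.9 mV full-scale input reads out directly in volts with 0.1 V resolution.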
The DPM has a full-scale accuracy of +/-1.5%, and between the DPM's 10:1 input-impedance variation, 1% resistor tolerances, and the limited selection of standard values, figure the system accuracy will decrease to about +/-3%. If you have the calibration gear, you could include a trim pot in the divider and calibrate the system to a higher accuracy.
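A quick way to sanity-check that +/-3% figure is to stack the worst-case terms. This is a rough sketch: the resistor values are hypothetical, and loading by the DPM's multi-megohm input is neglected since it is small against a 1 kOhm bottom leg:

```python
# Rough worst-case error budget for the divider + DPM.
# Resistor values hypothetical; DPM input loading neglected here.
def divider_gain(r_top, r_bottom):
    return r_bottom / (r_top + r_bottom)

nominal = divider_gain(999e3, 1e3)
# Worst case: the top resistor reads 1 % high while the bottom reads 1 % low.
worst = divider_gain(999e3 * 1.01, 1e3 * 0.99)
divider_err_pct = abs(worst / nominal - 1.0) * 100

meter_err_pct = 1.5  # DPM full-scale accuracy from the post
total_err_pct = meter_err_pct + divider_err_pct
print(f"divider: +/-{divider_err_pct:.2f} %, system: +/-{total_err_pct:.2f} %")
```

That lands in the same ballpark as the +/-3% quoted above. A trim pot in the bottom leg, adjusted against a known voltage reference, removes the resistor-tolerance term and leaves mainly the meter's own +/-1.5% spec.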
Thanks for your help! I will do what you recommended.