I have a Murata DMS-30PC panel voltmeter that I would like to use to measure DC volts on a power supply.
I need to scale the voltage down from 0-60 V to 0-0.2 V for the meter. Could I use a potential divider, or would an op amp be more accurate? Could you please advise me on a suitable op amp?
Welcome to the Forum!
Relating to the question above, may I ask why you are using a 0-0.2 V meter to measure a 0-60 V supply?
It seems the easiest solution would be a meter that is appropriately scaled to your application. Looking at the same DMS-30PC series, I see they offer a 0-200 V version that would be better suited.
Yes, an op amp would be the best bet, as op amps have a high input impedance and will not load down the divider. Almost any general-purpose op amp should work. I will post a more detailed response tomorrow.
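To make the op-amp route concrete, here is a quick sketch of the arithmetic. The resistor values below are illustrative assumptions, not a recommendation: a 300:1 divider brings 60 V down to the DPM's 0.2 V full scale, and a unity-gain op-amp buffer between the divider and the meter removes the loading effect of the meter's input impedance.

```python
# Hypothetical worked example: scaling 0-60 V down to a 0-0.2 V DPM input.
# Divider values (R1 top, R2 bottom) are illustrative assumptions only.
V_IN_MAX = 60.0    # supply full scale, volts
V_OUT_MAX = 0.2    # DPM full scale, volts

ratio = V_IN_MAX / V_OUT_MAX   # required attenuation -> 300:1

# Example divider: R1 = 299 kOhm, R2 = 1 kOhm gives exactly 300:1.
# A unity-gain buffer after R2 isolates the divider from the meter.
R1, R2 = 299e3, 1e3
v_meter = V_IN_MAX * R2 / (R1 + R2)

print(ratio)      # 300.0
print(v_meter)    # 0.2 V at full scale
```

With the buffer in place, the divider resistors can be large (keeping dissipation low at 60 V) without the meter's input impedance skewing the ratio.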
If you can afford the decrease in accuracy, you can use a 1000:1 ratio resistive voltage divider to give the DPM a +/-200V range (0.1V resolution).
The DPM has a full-scale accuracy of +/-1.5%, and between the DPM's 10:1 input impedance variation, 1% resistor tolerances, and the limited selection of standard values, figure the overall accuracy will decrease to roughly +/-3%. If you have the calibration gear, you could include a trim pot in the divider and calibrate the system to a higher accuracy.