I have an important tip to share. Please read through the problem description to the end to see the solution.
Recently I built a milli-ohmmeter with a panel display, which is pictured above.
There is a video showing how to build a device similar to mine here:
What I discovered after installation is that the device would not show the correct value, even though the constant current was accurate and the device was calibrated with the potentiometer. For reference, I was using a "Kelvin" four-wire setup, not the two-wire system the video shows.
This voltmeter has a separate power wire (red) and measurement wire (yellow), but one common ground (black). I suspected that the power circuit and measurement circuit needed to be separated to avoid a DC voltage drop being induced across the shared black wire.
I proceeded to separate the supply into two parts: one battery powering the voltmeter display (V+ and V- inputs) and the other battery supplying the constant-current source (I+ and I- inputs). This significantly improved the overall results by eliminating the unwanted voltage drop. The on/off switch was changed to a double-pole type so that one switch controls both halves of the circuit with their separate supplies.
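To make the shared-ground problem concrete, here is a rough back-of-the-envelope sketch in Python. The supply current and lead resistance below are illustrative assumptions, not values measured from my build:

    # Why a shared ground lead corrupts a four-wire reading:
    # the display's own supply current flows through the common black
    # lead and adds an IR drop in series with the sense voltage.
    # ASSUMED example numbers -- not measured from my build.
    display_supply_current = 0.030  # A, typical panel-meter draw (assumed)
    shared_lead_resistance = 0.05   # ohms of wire/trace in the black lead (assumed)
    test_current = 0.100            # A, from the constant-current source

    error_voltage = display_supply_current * shared_lead_resistance
    error_milliohms = error_voltage / test_current * 1000
    print(f"Offset: {error_voltage * 1000:.2f} mV "
          f"-> {error_milliohms:.1f} milliohms of apparent resistance")
    # With these numbers: 1.50 mV -> 15.0 milliohms of error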
Unfortunately, my device suffered from a non-zero offset when not connected to a circuit, showing 0.0020 when open and 0.0010 when the +V and -V leads were shorted together. It's normal for an unterminated voltmeter to show some open-circuit voltage, especially with test leads connected, and that accounted for half of the offset. The other half not only showed up at zero, it also showed up in my measurements (for example, a 0.100 ohm precision resistor reading 110 milliohms).
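The arithmetic behind that last number is simple: a fixed voltage offset divided by the test current is a fixed resistance error. A quick Python check:

    # A constant voltage offset maps directly to a constant resistance error.
    offset_voltage = 0.0010  # V, the residual offset I observed
    test_current = 0.100     # A, the constant-current source

    resistance_error = offset_voltage / test_current  # R = V / I
    print(f"{resistance_error * 1000:.0f} milliohms of error")  # -> 10 milliohms
    # So a 0.100 ohm resistor reads 0.100 + 0.010 = 0.110 ohms (110 milliohms).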
What I determined is that the voltmeter had the 0.0010 V DC offset coded into it somehow. No matter what I did to modify my circuit, or take the meter out of circuit and test it alone, the offset was always there. So I went searching the internet for a solution. Sure enough, the answer was out there!
I found this post describing a power-on zero-calibration routine, which worked perfectly in my case:
I temporarily shorted the two through-holes to the left and right of the "I_ADJ_Z" silkscreen, applied power, and the display did a little dance for a second or two. Then the desired 0.0000 magically appeared on the display! I was so delighted.
Since zeroing the offset shifted the calibration, I then precisely adjusted the device's display against my portable voltage standard using the trim pot on the back of the voltmeter board.
Now the device works perfectly. When my resistor is 100 milliohms, the display shows 0.0100. Note: reading milliohms on this device requires ignoring the decimal point, due to the 100 mA constant-current supply. By Ohm's law, 0.0001 volts is developed across a 1 milliohm resistor when the current is 0.1 A: V = R*I = 0.001 Ω * 0.1 A = 0.0001 V. This is why I chose a meter with four digits of precision.
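If it helps, here is that display convention as a quick sketch; the function name is mine, just for illustration:

    TEST_CURRENT = 0.100  # A, the constant-current source

    def displayed_value(resistance_ohms: float) -> str:
        """Voltmeter reading for a given resistance: every milliohm develops
        0.1 mV, so the four digits after the decimal point read directly
        in milliohms."""
        return f"{resistance_ohms * TEST_CURRENT:.4f}"

    print(displayed_value(0.001))  # 1 milliohm    -> 0.0001
    print(displayed_value(0.100))  # 100 milliohms -> 0.0100
    print(displayed_value(0.601))  # 601 milliohms -> 0.0601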
Here you can see the prototype testing the round-trip resistance of a USB extension cable, with a shunted connector at the far end. The total resistance is 601 milliohms.
Here you can see the two Kelvin clips clipped together; the reading is 0 milliohms. Before calibrating/zeroing the voltmeter offset, it showed 10 milliohms. The four-wire connection is I+ and V+ on the red clip jaws, and I- and V- on the black jaws. The jaws are mechanically joined but electrically isolated, so the resistance is measured at the tips only.
This prototype case is simply an old Rolodex box, once used to store business cards back in the day. Everything is attached with sticky tape, and the 4P header accepts various adapters and cables. Later I can decide what type of 3D-printed box I want to create and which connector attachment I prefer; the unmodified proto box can then be reused for the next project. The heatsink on the adjustable voltage regulator, together with large current-limiting resistors, keeps the reference voltage steady so the device delivers a constant 100 mA without drifting from resistive heating during operation.
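For anyone reproducing the constant-current half, this is the classic LM317 current-source arrangement (the LM317 is named below); the exact resistor network is my assumption, but the math is fixed by the LM317's nominal 1.25 V reference:

    # LM317 constant-current source: the regulator servos ~1.25 V between
    # OUT and ADJ, so a resistor between those pins sets the current.
    V_REF = 1.25            # V, nominal LM317 reference
    target_current = 0.100  # A

    r_set = V_REF / target_current  # -> 12.5 ohms
    p_set = V_REF * target_current  # -> 0.125 W dissipated
    print(f"Set resistor: {r_set:.1f} ohms, dissipating {p_set:.3f} W")
    # Oversized (higher-wattage) resistors run cooler, which is what keeps
    # the 100 mA from drifting as things warm up.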
The tested capability of this circuit using the 5 V source for the constant current tops out at 20 ohms (2.0000 displayed). Side note: this ceiling can be raised by using a larger voltage source (Rmax = 10*Vin - 30), but that results in a higher temperature rise due to the increased voltage drop across the LM317, causing more Vref drift. Above 15 ohms, a typical DMM will do just fine anyway. Keep in mind that the power delivered to the resistor is R/100 watts, so higher-value resistors will heat up and change value.
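Here are the ceiling formula and the power figure as a quick calculation. The ~3 V headroom term is my reading of the formula (LM317 dropout plus its 1.25 V reference), and the 12 V example is only to show the scaling:

    def r_max(v_in: float) -> float:
        """Max measurable resistance at 100 mA: Rmax = 10*Vin - 30,
        i.e. (Vin - ~3 V of LM317 headroom) / 0.1 A."""
        return 10 * v_in - 30

    def resistor_power(r_ohms: float, current: float = 0.100) -> float:
        """P = I^2 * R, which is R/100 watts at 100 mA."""
        return current ** 2 * r_ohms

    print(r_max(5.0))            # -> 20 ohms (2.0000 displayed), as tested
    print(r_max(12.0))           # -> 90 ohms with a 12 V source
    print(resistor_power(20.0))  # -> 0.2 W into a 20-ohm resistor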
I just wanted to share this "secret" voltmeter zero-offset tip, in case you run into the same situation with a voltmeter module. Keep in mind that your module may be different from the one I used or the device in the thread I linked, so try at your own risk and be safe please. I recently attempted this on another volt/amp/watt meter made by Deek Robot, and it brought up a calibration menu that required me to apply 100 V and then 5 A of current! Fortunately, I had equipment that could do that.
These menus are probably intended for factory calibration, since there is no official description of the calibration process anywhere.
Having a calibration routine allows the manufacturer to source inexpensive components from varying suppliers over time. Regardless of what is installed on the PCB, the factory calibration routine is a "catch-all" that lets them keep shipping working devices through all these changes.
Sometimes, though, the calibration process can let bad product slip out the door anyway. For instance, many bad devices can ship to customers if the calibration source supplies an incorrect voltage or current, especially if it is off by a factor of 10 and slips through final inspection because the reading "looks right".
If you want to see an instance where this 10X issue actually happened, check out Julian Ilett's discovery at the following link: