Hello, I had a quick question regarding the VZCR pin. According to the instructions it is supposed to be the zero-amp reference. This is useful because, unless I force the current through the device to zero at boot, there is no other way to get a good zero reference.
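For reference, the boot-time zero calibration mentioned above is often done by averaging a burst of readings while the load is guaranteed off. A minimal sketch, where `read_viout_volts` is a hypothetical stand-in for however you read the sensor's output voltage:

```python
def calibrate_zero(read_viout_volts, samples=100):
    """Average a burst of readings taken at known zero current.

    The returned average is then used as the zero-current offset
    for subsequent measurements.
    """
    total = 0.0
    for _ in range(samples):
        total += read_viout_volts()
    return total / samples

# Example with a fake ADC read that always returns 1.652 V:
zero_offset = calibrate_zero(lambda: 1.652)
print(zero_offset)
```

In a real system the averaging helps smooth out ADC noise, but it still requires guaranteeing zero current at startup, which is exactly the dependency in question.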
The problem I am encountering is that the outputs of VZCR and VIOUT differ even when there is no current. With VCC at 3.3V and nothing connected on the current-carrying side, VIOUT reads 1.652V while VZCR reads 1.634V; that difference corresponds to almost 1A of error.
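To put that voltage difference in current terms, here is a quick calculation. The 20 mV/A sensitivity below is an illustrative assumption only (sensitivity varies by part variant and supply voltage; check the datasheet for your device):

```python
# Convert the VIOUT-vs-VZCR offset into an equivalent current error.
# SENSITIVITY is an assumed example value, not from a specific datasheet.
VIOUT = 1.652        # measured output at zero current (V)
VZCR = 1.634         # zero-current reference output (V)
SENSITIVITY = 0.020  # assumed sensitivity (V per A)

offset_v = VIOUT - VZCR
error_a = offset_v / SENSITIVITY
print(f"offset: {offset_v * 1000:.1f} mV -> {error_a:.2f} A apparent error")
```

With these numbers the 18 mV offset maps to roughly 0.9 A, consistent with the "almost 1 A" figure above under the assumed sensitivity.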
Is this ‘normal’ device-to-device variability?
The percent difference between those two measurements is about 1%, which is within the chip's typical output error. Do you still see this difference when significant current is flowing through the sensor?
Thanks for the response. As expected, the discrepancy between using the reference voltage and a zero-current average captured at startup becomes proportionally smaller as the current increases; in simple terms, the error between 10A and 11A matters less than the error between 0A and 1A. I have no easy way of measuring currents above 3-4A, so comparing absolute error is difficult, but my concern is mostly with the low-amperage readings, and with not having to depend on a zero-current calibration at boot time.
I do see what you mean, though: over the full range the percentage error is within spec, so I can't really complain.
You might consider using one of our other current sensors, such as our ±5A ACS714 carrier; its higher sensitivity should help when measuring lower currents. However, it does not have a zero-current reference pin.