I’m using an ACS714 current sensor with an Arduino Mega 2560 for power measurement and RC servo control. It works, but the accuracy of the current measurement is poor: the more current that flows through the sensor, the worse the reading gets. I use a wattmeter as a benchmark, and confirmed measurements with a multimeter where possible.
The voltage divider is the same as in the how-to: R1 = 12k ohms and R2 = 5k ohms. The breadboard voltage regulator is based on building a MintDuino, using a 7805 regulator and 10 µF capacitors. It takes in 12 V and outputs ~4.94 V. I use it as a low-voltage power bus, supplying the ACS714 and some RC servos.
To get the voltage measurement correct, I edited this line:
pinVoltage = avgBVal * 0.00705;
To get the amps measurement “close”, I edited this line:
Could the inaccuracy in the amps measurement be related to the voltage regulator, either not supplying a full 5 V to the current sensor, or being too choppy or noisy? I’m not sure how to proceed with troubleshooting this. I’m very new to electronics and there are many things I don’t understand, but I’m learning slowly; any guidance is most appreciated. Thanks.
Can you clarify what you mean when you say that the accuracy of the measurement is not good? Are the sensor’s readings unstable, or just different from what the wattmeter reads? If they are fluctuating, how much variance do you see? If they do not match the wattmeter, how different are the two measurements?
The current sensor’s readings depend on Vcc; since your regulator supplies that Vcc, it is possible that it is causing the sensor’s output to fluctuate. Can you measure the voltage output of your regulator directly and see if it fluctuates?
Thanks for the response. With your feedback, I was able to get very close to the wattmeter’s results; I’m not sure whether I should expect better. The variance of the ACS714 compared to the wattmeter is about ±0.01 to 0.02 under load, for both volts and amps, and more like ±0.1 at no load.
To get to this point, I changed voltage regulators from the 7805 style to a Pololu D15V35F5S3, and I set the current measurement calculation like this:
I set the battery voltage conversion multiplier like this:
With the car-puter running, the servos powered from the bus but with no torque applied, and the camera off:
Observing the Vcc supplied to the ACS714, the voltage fluctuates from 5.05 V (no load) to 4.96 V (full load).
If I dial in the code to measure volts at its most accurate (e.g., making the ACS714 follow the wattmeter’s measurements under both load and no-load conditions), then the amps measurement varies more.
If I dial in the code to measure amps at its most accurate, following the wattmeter, then the volts measurement disagrees more. The code snips posted in this reply seem to be the best I can do for both volts and amps across the operating load range.
It seems I will have to “fudge” the software calculations to some artificial sweet spot. Should I expect better accuracy? Do the tweaks I made in the code look reasonable?
It looks like you are getting fairly good accuracy. I have not read through the original code thoroughly, but given the accuracy you are getting, your modifications to the code seem to be working fairly well. I am not sure what else you might try to improve your accuracy further.