Good point. Maybe I'll check on the TI forums about this too. Perhaps there are recommended resistance values when you need to measure through a voltage divider. On that note, I do know I'm currently doubling the error, because I'm using a 1/2 voltage divider to measure a battery that can never be more than 4.2 V. Instead of two 10K resistors, perhaps I'll try a combination that gives me a 3/4 ratio, so that 4.2 V reads as 3.15 V instead of 2.1 V. That alone should cut the error percentage some.
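Just to make the arithmetic concrete, here's a minimal sketch of the conversion I have in mind, assuming (purely as an example) a 10K resistor on top and 30K on the bottom, and a hypothetical readDividerMillivolts() that returns the voltage at the divider tap in mV:

```c
#include <stdint.h>

/* Divider: cell --- R_TOP (10k) ---+--- R_BOTTOM (30k) --- GND
 * The ADC sees the tap, which is R_BOTTOM / (R_TOP + R_BOTTOM) = 3/4
 * of the cell voltage, so a full 4.2 V cell reads about 3.15 V. */
#define R_TOP_OHMS     10000UL
#define R_BOTTOM_OHMS  30000UL

/* Hypothetical: however the tap voltage actually gets read, in millivolts. */
extern uint16_t readDividerMillivolts(void);

/* Scale the tap reading back up to the actual cell voltage in mV. */
uint16_t readCellMillivolts(void)
{
    uint32_t tapMv = readDividerMillivolts();
    return (uint16_t)(tapMv * (R_TOP_OHMS + R_BOTTOM_OHMS) / R_BOTTOM_OHMS);
}
```

Of course that only works if whatever reference the ADC measures against is comfortably above 3.15 V, so the headroom gets thinner than with the 1/2 divider.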
I'm having a tough time gauging how critical this is. It is possible to over-think this stuff too. Its purpose is to warn a user when there are only a few hours of life left in the cell, to alert him/her to recharge as soon as convenient. If you've ever looked at the discharge curve of a LiPo cell, it's pretty flat up to a certain point, and after that it's as steep as a cliff. That's why simply waiting for adcReadVddMillivolts() to drop didn't work out for me. By the time that voltage started dipping by 100 mV, the warning would only be good for about 10 minutes at best before shutdown was mandatory (I have a separate precision voltage-monitor circuit to force that). I've done my own discharge study of the cell I'm using, based on the actual drain of the Wixel running the app I've written. From that I can see that the more accurately I can measure cell voltage when it's between 3.8 and 3.7 volts, the better chance I have of allowing maximum hours of use before the warning starts, while still allowing a couple of hours of "grace" after the warning. It's not an exact science (there's battery age, temperature, etc.), but reliably measuring within 50 mV of the truth would ease my mind.
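For what it's worth, here's the rough shape of the low-battery check I have in mind, building on the readCellMillivolts() sketch above (or whatever calibrated reading I end up with). The 3750/3850 mV thresholds and the sample count are just placeholders pulled from my 3.7-3.8 V window, not tested values; the averaging and hysteresis are there so a single 50 mV glitch doesn't flicker the warning:

```c
#include <stdint.h>
#include <stdbool.h>

#define WARN_MILLIVOLTS   3750u  /* warn somewhere in the 3.7-3.8 V knee */
#define CLEAR_MILLIVOLTS  3850u  /* hysteresis so the warning doesn't flicker */
#define NUM_SAMPLES       16u    /* average several readings to knock down noise */

extern uint16_t readCellMillivolts(void);  /* from the earlier divider sketch */

bool lowBatteryWarning(void)
{
    static bool warning = false;
    uint32_t sum = 0;
    uint16_t avg;
    uint8_t i;

    for (i = 0; i < NUM_SAMPLES; i++)
    {
        sum += readCellMillivolts();
    }
    avg = (uint16_t)(sum / NUM_SAMPLES);

    if (avg < WARN_MILLIVOLTS)        { warning = true;  }
    else if (avg > CLEAR_MILLIVOLTS)  { warning = false; }
    return warning;
}
```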
Since the cell is charged by a special management chip that stops exactly at 4.2 V, perhaps I can write a routine to let the user trigger a self-calibration the first time it's turned on after a charge. Or I suppose I could calibrate each one individually with a saved variable or param variable.
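Here's roughly what I'm picturing for that self-calibration, again building on the hypothetical readCellMillivolts() above; where the factor actually gets stored (a param variable, or something written to flash) is a separate question I'd still have to work out:

```c
#include <stdint.h>

#define FULL_CHARGE_MV  4200u   /* the charge-management chip stops right at 4.2 V */
#define SCALE_SHIFT     10      /* fixed-point scale factor: 1.0 == 1024 */

extern uint16_t readCellMillivolts(void);  /* uncalibrated reading, from the sketch above */

static uint16_t calScale = (1u << SCALE_SHIFT);  /* default: no correction */

/* Call once right after a full charge, when the cell is known to be at 4.2 V. */
void calibrateAtFullCharge(void)
{
    uint32_t measured = readCellMillivolts();
    if (measured != 0)
    {
        /* e.g. if we read 4100 when the truth is 4200, calScale becomes ~1049/1024 */
        calScale = (uint16_t)(((uint32_t)FULL_CHARGE_MV << SCALE_SHIFT) / measured);
    }
    /* Persisting calScale (param variable / flash write) is still TBD. */
}

uint16_t readCellMillivoltsCalibrated(void)
{
    return (uint16_t)(((uint32_t)readCellMillivolts() * calScale) >> SCALE_SHIFT);
}
```

One catch: a single point at 4.2 V only corrects a gain-type error. If the error turns out to be more of an offset, a one-point calibration won't fully fix readings down around 3.7 V.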