Best accuracy of Wixel ADC

Assuming I am correctly calling adcSetMillivoltCalibration(adcReadVddMillivolts()) occasionally, and converting raw voltage readings on a P0 port using adcConvertToMillivolts(adcRead(MY_ADC_CHANNEL)), what is the best accuracy I should hope for? I don’t want to make a long post unless you need it to verify my code, but after testing with three Wixels I’m finding them all to be between 40 and 80 mV high when reading a known 3.00 V source, verified with two separate Fluke DMMs. This is not horrible, only around 2% off. But as I said, they all seem consistently high. Is that about correct, or do my findings seem suspicious?
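
In case it helps without a long post, the pattern I’m describing boils down to something like this (a minimal sketch, not my actual app; MY_ADC_CHANNEL is just my own #define for the P0 channel, and the exact types should be checked against the SDK’s adc.h):

```c
#include <wixel.h>
#include <adc.h>

#define MY_ADC_CHANNEL 0   // P0_0 here, only for the sketch

// Reads the voltage on the chosen P0 pin, in millivolts.
int32 readPinMillivolts(void)
{
    // Refresh the calibration (my real app only does this occasionally)
    // so the conversion below is based on a current measurement of VDD.
    adcSetMillivoltCalibration(adcReadVddMillivolts());

    // Raw ADC reading on the channel, converted to millivolts.
    return adcConvertToMillivolts(adcRead(MY_ADC_CHANNEL));
}
```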

Hello.

We have not done any experiments to characterize the ADC on the CC2511F32, so all the information we have about its accuracy can be found in the datasheet. Errors from both the ADC and the internal voltage reference used by adcReadVddMillivolts could contribute to the error you are seeing.

–David

Still, if I do find the error to be consistent enough to correct, the place to do it would be to replace adcReadVddMillivolts() with my own version that supplies a corrected multiplier and/or zero offset, correct?
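
Something like this is what I have in mind (just a sketch; the gain and offset numbers are placeholders I would have to determine empirically, not values from the datasheet):

```c
#include <wixel.h>
#include <adc.h>

// Hypothetical per-unit correction.  The placeholder values below are only
// illustrative; real ones would be measured against a trusted reference
// like the Fluke.
#define MV_OFFSET_CORRECTION  (-60)   // knock off roughly the 40-80 mV I'm seeing
#define MV_GAIN_NUM           1       // gain correction expressed as NUM/DEN
#define MV_GAIN_DEN           1

// Wrapper around the library conversion that applies the correction.
int32 adcConvertToMillivoltsCorrected(uint16 adcResult)
{
    int32 mv = adcConvertToMillivolts(adcResult);
    return mv * MV_GAIN_NUM / MV_GAIN_DEN + MV_OFFSET_CORRECTION;
}
```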

Depending on which part of the procedure causes the error, you could make new versions of adcReadVddMillivolts and/or adcConvertToMillivolts to reduce it. Whether that is a good idea depends on your application. If you want your code to work reliably on thousands of different units and across a wide range of temperatures, keep in mind that the results you are seeing today with these three Wixels are not necessarily representative; by adding correction offsets to the code, you might actually make the errors worse under other conditions. I think it would be better to come up with an upper bound on how large the error could be based on the information in the datasheet, and then, whenever you use the ADC to make any kind of important decision, account for the possibility of that much error.

–David

Good point. Maybe I’ll check on the TI forums about this too. Perhaps there are recommended resistance values for measuring through a voltage divider. On that note, I know I’m currently doubling the error, because I’m using a 1/2 voltage divider to measure a battery that can never be more than 4.2 V. Instead of two 10K resistors, perhaps I’ll try a combination that gives me a 3/4 ratio, so that 4.2 V will read as 3.15 V instead of 2.1 V. That alone should cut the error percentage some.
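
For example, with assumed resistor values (just a sketch; the names are made up, and 10k over 30k is only one combination that gives the 3/4 ratio, assuming 3.15 V is still within the range my ADC setup can measure):

```c
#include <wixel.h>

// battery --- R_TOP --- ADC pin --- R_BOTTOM --- GND
// Vpin = Vbatt * R_BOTTOM / (R_TOP + R_BOTTOM) = Vbatt * 30/(10+30) = 0.75 * Vbatt,
// so a full 4.2 V cell puts 3.15 V on the pin instead of 2.1 V.
#define R_TOP_KOHM     10
#define R_BOTTOM_KOHM  30

// Scales a pin measurement (in mV) back up to battery millivolts.
int32 pinMvToBatteryMv(int32 pinMillivolts)
{
    return pinMillivolts * (R_TOP_KOHM + R_BOTTOM_KOHM) / R_BOTTOM_KOHM;
}
```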

I’m having a tough time gauging how critical this is. It is possible to over-think this stuff too. :frowning: Its purpose is to warn a user when there are only a few hours of life left in the cell, to alert him/her to recharge as soon as convenient. If you’ve ever looked at the discharge curve of a LiPo cell, it’s pretty flat up to a certain point, and after that it’s as steep as a cliff. That’s why simply waiting for adcReadVddMillivolts() to drop didn’t work out for me: by the time that voltage started dipping by 100 mV, the warning would only be good for about 10 minutes at best before shutdown was mandatory (I have a separate precision voltage monitor circuit to force that). I’ve done my own discharge study of the cell I’m using, based on the actual drain of the Wixel running the app I’ve written. From that I can see that the more accurately I can measure cell voltage when it’s between 3.8 and 3.7 volts, the better chance I have of allowing maximum hours of use before the warning starts, while still allowing a couple of hours of “grace” after the warning. It’s not an exact science (there’s battery age, temperature, etc.), but reliably measuring within 50 mV of the truth would ease my mind.
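
Concretely, the warning decision I have in mind looks something like this (a sketch; WARN_MV would come out of my discharge study and MARGIN_MV is whatever worst-case measurement error I end up budgeting for, per David’s suggestion above, so both numbers here are placeholders):

```c
#include <wixel.h>

// Placeholder thresholds: WARN_MV sits somewhere in the 3.7-3.8 V knee
// found in the discharge study; MARGIN_MV covers the measurement error.
#define WARN_MV    3750
#define MARGIN_MV  50

// Warn as soon as the true cell voltage *could* already be at or below
// WARN_MV: since the reading might be up to MARGIN_MV too high, warn
// when the reading drops below WARN_MV + MARGIN_MV (erring on the side
// of warning early).
uint8 shouldWarnLowBattery(int32 batteryMillivolts)
{
    return batteryMillivolts < (WARN_MV + MARGIN_MV);
}
```

The input would be the divider-scaled reading from the earlier sketch.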

Since the cell is charged by a special management chip that stops exactly at 4.2 V, perhaps I can write a routine to let the user trigger a self-calibration the first time it’s turned on after a charge. Or I suppose I could calibrate each one individually with a saved variable or a param variable.
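
One way to do the per-unit version would be something like this (a sketch; param_batt_cal_per_mille is a hypothetical param of my own, not anything provided by the SDK):

```c
#include <wixel.h>

// Hypothetical per-unit calibration parameter, set when loading the app
// onto each Wixel (units: parts per thousand; 1000 = no correction).
int32 CODE param_batt_cal_per_mille = 1000;

// Applies the stored per-unit correction to a battery reading in mV.
int32 applyBatteryCalibration(int32 batteryMillivolts)
{
    return batteryMillivolts * param_batt_cal_per_mille / 1000;
}

// Alternative: self-calibrate right after a full charge, when the cell is
// known to be at 4200 mV because the management chip stops exactly there.
// The result could be kept in RAM or stored some other way.
int32 computeCalPerMille(int32 measuredMillivoltsAtFullCharge)
{
    return (int32)4200 * 1000 / measuredMillivoltsAtFullCharge;
}
```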