LSM303DLHC: should there be jitter in the values?

I have several Pololu LSM303DLHC Carrier Boards (Item #2124).
When any of these boards is lying still on the table (in any orientation, but motionless), there is jitter in the readings (approximately ±5 LSB) on all three accelerometer axes.
There are no vibrating objects on this table and around it.

The accelerometer register settings are:
REG1_A (20h) = 47h
REG2_A (21h) = 00h
REG4_A (23h) = 88h
The other registers are left at their defaults (not initialized in code).
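For reference, here is roughly how that initialization looks in code, in the same style as the snippet later in this thread (the i2c_* helpers are placeholders, not a specific library; 0x32 is the LSM303DLHC accelerometer write address):

    //enable accelerometer: CTRL_REG1_A (20h) = 47h
    i2c_start();
    i2c_write_byte(0x32); // write acc (LSM303DLHC)
    i2c_write_byte(0x20); // CTRL_REG1_A
    i2c_write_byte(0x47); // ODR = 0100 (50 Hz), normal mode, X/Y/Z enabled
    i2c_stop();

    //high-resolution mode: CTRL_REG4_A (23h) = 88h
    i2c_start();
    i2c_write_byte(0x32); // write acc
    i2c_write_byte(0x23); // CTRL_REG4_A
    i2c_write_byte(0x88); // BDU = 1, FS = 00 (+/-2 g), HR = 1
    i2c_stop();

(CTRL_REG2_A = 00h is the power-on default, so no write is needed for it.)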

The effect appears when reading 6-byte blocks at a 10 ms period, 32 (or any number of) times in succession.
The I2C bus speed (400 or 100 kHz) makes no difference.
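For completeness, the 6-byte read looks like this (a sketch with the same placeholder helpers; i2c_read_byte() is assumed to take an ACK flag; on the LSM303DLHC, setting the MSB of the sub-address enables register auto-increment):

    uint8_t buf[6];
    i2c_start();
    i2c_write_byte(0x32);        // write acc
    i2c_write_byte(0x28 | 0x80); // OUT_X_L_A, MSB set for auto-increment
    i2c_start();                 // repeated start
    i2c_write_byte(0x33);        // read acc
    for (int i = 0; i < 6; i++)
        buf[i] = i2c_read_byte(i < 5); // ACK every byte except the last
    i2c_stop();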

Because of the LSM303DLHC's 50 Hz ODR and my 100 Hz read rate, I sometimes see a pair of identical values, with jitter like this:
927
927

921
928
925
925

923

(on any axis, at any value, the jitter stays within 10 LSB)

There are no error reports on the I2C bus, and signal integrity looks fine on the oscilloscope (levels, rise and fall times, etc.).
VIN and VI/O = 3.3 V.

Is such jitter an inherent characteristic of the LSM303DLHC (i.e. G-sensor accuracy or digital noise)?
If not, how can I eliminate it?

Hello.

Have you converted the jitter you are getting to acceleration values and compared those values to the accuracy in the sensor’s datasheet? Those values don’t seem to have that much variation.

Hello. Thanks for the reply.
You asked: "Have you converted the jitter you are getting to acceleration values?"
Yes, sure. The raw 16-bit accelerometer code was shifted right by 4 bits. One LSB is 1 mg (at FS = ±2 g, high-resolution mode). The jitter is ±5-6 mg, or approximately 0.6% of full scale.
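In code, that conversion is just the following (a sketch; buf[] holds OUT_X_L_A then OUT_X_H_A, and an arithmetic right shift is assumed, which holds on common compilers):

    // combine the two bytes, then shift the left-justified 12-bit value down;
    // in high-resolution mode at FS = +/-2 g, 1 LSB = 1 mg
    int16_t raw_x = (int16_t)((buf[1] << 8) | buf[0]);
    int16_t x_mg  = raw_x >> 4; // signed 12-bit result, 1 mg per LSB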

That's the whole point: I can't figure out how to determine "the accuracy in the sensor's datasheet".
There are two places in the datasheet that touch on the term "G-sensor accuracy".

In Table 3. Sensor characteristics:
Acceleration noise density: 220 ug/sqrt(Hz) (ODR bits set to 1001, FS bits set to 00, normal mode),

And in Table 8. Accelerometer operating mode selection:
BW [Hz] = ODR/9 @ Normal mode.

There is no additional explanation of the frequency band over which the "acceleration noise density" is specified, nor of what "BW" in Table 8 means.
Assuming "BW" in Table 8 means "bandwidth" and that it applies to the "acceleration noise density" (ND), I tried to estimate the RMS noise as ND * sqrt(BW):
For ODR = 0010 (10 Hz): 220 * sqrt(10/9) ≈ 232 ug RMS;
For ODR = 0100 (50 Hz): 220 * sqrt(50/9) ≈ 519 ug RMS;
For ODR = 1001 (1344 Hz): 220 * sqrt(1344/9) ≈ 2688 ug RMS.
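The same arithmetic in one small sketch (assuming the density is flat over the band and BW = ODR/9):

    #include <math.h>

    // expected RMS noise in ug, assuming a flat 220 ug/sqrt(Hz) density
    // and BW = ODR/9 from Table 8 (normal mode)
    double rms_noise_ug(double odr_hz)
    {
        return 220.0 * sqrt(odr_hz / 9.0);
    }
    // rms_noise_ug(10.0)   ~  232 ug (~0.2 LSB at 1 mg/LSB)
    // rms_noise_ug(50.0)   ~  519 ug (~0.5 LSB)
    // rms_noise_ug(1344.0) ~ 2688 ug (~2.7 LSB)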
I was not able to convert this into a maximum deviation in LSB, but it is clear that these values differ significantly (about 12 times between 10 Hz and 1344 Hz). But:
When
REG1_A (20h) = 27h (ODR = 10 Hz),
REG1_A (20h) = 47h (ODR = 50 Hz),
REG1_A (20h) = 77h (ODR = 400 Hz) -
there is no change in the jitter: ±5-6 LSB.
Only at REG1_A (20h) = 97h (ODR = 1344 Hz) does the jitter seem to increase, up to ±9 LSB.
There is no 12-times difference…

Do the lower 4-5 bits of the LSM303DLHC accelerometer's 12-bit code simply contain digital noise that can be ignored, or is something wrong in my register settings?

P.S.
To rule out any possible vibration, I put an inch-thick rubber mat on the concrete floor, a 16-pound marble slab on top of it, and then the LSM303DLHC carrier board with a thick book on top.

I am using the LSM303DLM with the following register settings:

	//enable accelerometer
	i2c_start(); 
	i2c_write_byte(0x30); // write acc
	i2c_write_byte(0x20); // CTRL_REG1_A
	i2c_write_byte(0x27); // normal power mode, 50 Hz data rate, all axes enabled
	i2c_stop();

	//enable magnetometer
	i2c_start(); 
	i2c_write_byte(0x3C); // write mag
	i2c_write_byte(0x02); // MR_REG_M
	i2c_write_byte(0x00); // continuous conversion mode
	i2c_stop();

Here is a dump of the raw readings from the mag and the acc with it sitting on the desk, not moving: MX,MY,MZ,AX,AY,AZ

Looking at AZ, this is about 120/17000 or < 1% variation, which is roughly what you seem to be observing. In more detail, for 279 samples the standard deviations of the above 6 columns are 4, 4, 5, 31, 31, 49 respectively. For a $15 sensor, I'm happy with that.
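In case anyone wants to reproduce those numbers, here is a minimal sketch of a per-column sample standard deviation (plain C, nothing sensor-specific):

    #include <math.h>
    #include <stdint.h>

    // sample standard deviation of n readings (two-pass method)
    double stddev(const int16_t *x, int n)
    {
        double mean = 0.0, ss = 0.0;
        for (int i = 0; i < n; i++) mean += x[i];
        mean /= n;
        for (int i = 0; i < n; i++) ss += (x[i] - mean) * (x[i] - mean);
        return sqrt(ss / (n - 1));
    }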

Hi, Jim Remington
From your code:

According to the datasheet (April 2011, Doc ID 018771 Rev 1), Table 20, Data rate configuration:
ODR = 0010 - normal / low-power mode (10 Hz).
I.e., i2c_write_byte(0x27) sets normal power mode and 10 Hz, not 50 Hz.
Do you have any other datasheet?

Thank you for your information: "this is about 120/17000 or < 1% variation."
That value is really very close to what I observed, so it seems this is normal.

Please criticize my conclusion:
The G-sensor's accuracy is close to 1% with any initialization.
Bits <3…0> of the 12 contain digital noise.
This means that only 8 of the 12 bits are significant.
Why then use 12-bit input values at all?
For a microcontroller, calculations with 8-bit input words are much faster than with 16-bit ones…

Someone please reply: does the term "ODR" mean the rate at which data is updated in the output registers?
In that case, at ODR = 10 Hz, the same value should be there for reading for 100 ms.
At a read rate of 100 Hz and an output-data-refresh rate (value change in the register) of 10 Hz, I would expect to see about ten identical samples in a row, but I didn't see that… 2-3 at most, and rarely. More often I see a run of different samples. What's wrong?
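One way to check this directly is to poll the data-ready flag instead of reading blindly (a sketch; acc_read_reg() and read_acc_xyz() are hypothetical helpers; STATUS_REG_A is at 27h):

    uint8_t status = acc_read_reg(0x27); // STATUS_REG_A
    if (status & 0x08) {
        // ZYXDA = 1: a new X/Y/Z sample is available, so read it now
        read_acc_xyz();
    }
    if (status & 0x80) {
        // ZYXOR = 1: a sample was overwritten before being read
    }

Counting how often ZYXDA is set between polls gives the true register update rate, independent of how fast you read.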

There is a BDU bit in CTRL_REG4_A (23h), Block data update: 0 = continuous update, 1 = output registers not updated until both MSB and LSB have been read.
I tried both values but did not notice a difference. Can anybody explain this bit?
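The situation BDU protects against is an ODR update landing between the two byte reads (a sketch; read_reg() is a hypothetical single-register read):

    // with BDU = 0 the output registers update continuously, so an ODR tick
    // between these two reads can mix bytes from two different samples:
    uint8_t lo = read_reg(0x28); // OUT_X_L_A from sample N
    uint8_t hi = read_reg(0x29); // OUT_X_H_A, possibly already from sample N+1
    // with BDU = 1 both bytes stay frozen until the MSB and LSB have both
    // been read, so the pair is guaranteed to come from the same sample

Since all six bytes are read in one auto-incremented transfer, the window for such a mix-up is very small, which would explain why flipping BDU showed no visible difference.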

I stated that my code was for the LSM303DLM, and its datasheet says that 0x27 in CTRL_REG1_A specifies 50 Hz, normal mode.

Jim Remington: [quote]my code was for the LSM303DLM[/quote].
Then it’s clear.

Probably I am right in thinking that the LSM303DLM and LSM303DLHC have the same G-sensor.
I have now leafed through the LSM303DLM's datasheet and found many previously obscure things, such as the correspondence between ODR and the low-pass filter cut-off frequency, etc.
Thanks for your notes. They helped. It is difficult to work alone. :smiley: