How is bitshift >> used in the gradient example?
// Update the colors.
byte time = millis() >> 2;  // divide the millisecond counter by 4
for (uint16_t i = 0; i < LED_COUNT; i++)
{
  byte x = time - 8*i;  // offset each LED, wrapping at 256
  colors[i] = rgb_color(255 - x, 255 - x, 255 - x);
}
It's shifting millis() right by 2, and I don't understand why. When I change the 2 to a higher number, it slows the gradient down, but I don't follow the logic going on here.
Edit: Oh, I guess it's dividing the value of millis() by shifting its bits to the right, so the time counter advances more slowly.