Not strictly a robotics project, but I figured I’d share. We’re setting up an instrument where we take light coming out of a telescope, split it into two beams using a beamsplitter, filter one for the IR and one for the UV, and use two separate cameras to image the light. We’re trying to take 200 frames/sec and keep the two cameras in complete lock-step. Electrically, you can sync the two cameras. But as far as the software goes, the cameras use a gigabit ethernet interface, so they inherit all the potential problems of any other ethernet-based device: latency, dropped packets, etc.
To test that they can take data in lock-step, we needed a light source for our artificial star that would be on for one frame and off for the next, so the data files coming off the cameras should be matched pairs of images: light, then dark, then light, then dark, and so on.
Rather than dink around with a PLL and a signal generator, I took the clock signal for the cameras (0-5V 50% duty-cycle square wave), used that as an input to a Baby-O, set up a pin change interrupt to pick up the state change on the input pin, and used that to drive the onboard LED. 200Hz isn’t even making it break a sweat, and we were able to do all the tests for camera synchronization.
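The firmware for this is tiny. Here’s a minimal sketch of the idea, assuming an ATmega328-based Baby Orangutan with the camera clock fed into PB0 (PCINT0) and the onboard LED on PD1 (my pin choices are assumptions, not from the original post; adjust to your wiring):

```c
// Mirror a square-wave input onto the onboard LED using a
// pin-change interrupt. Assumes ATmega328P, clock input on
// PB0/PCINT0, user LED on PD1 -- hypothetical pin assignments.
#include <avr/io.h>
#include <avr/interrupt.h>

ISR(PCINT0_vect)
{
    // Fires on every edge of the input; copy input state to the LED.
    if (PINB & (1 << PB0))
        PORTD |= (1 << PD1);    // input high -> LED on
    else
        PORTD &= ~(1 << PD1);   // input low  -> LED off
}

int main(void)
{
    DDRB &= ~(1 << DDB0);       // PB0 as input (camera clock signal)
    DDRD |= (1 << DDD1);        // PD1 as output (onboard LED)

    PCICR  |= (1 << PCIE0);     // enable pin-change interrupt group 0
    PCMSK0 |= (1 << PCINT0);    // unmask PCINT0 (PB0)

    sei();                      // global interrupt enable
    for (;;)
        ;                       // everything happens in the ISR
}
```

At 200 Hz the ISR runs 400 times a second (once per edge), which is nothing for a 20 MHz AVR, so the LED tracks the clock with microsecond-scale lag.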
It worked so well I’m using the same setup to do latency tests on the cameras starting tomorrow.
Yeah, I know a Baby-O was overkill for this (no DC motors, no external sensors, no nuthin’) but it’s a great prototyping platform, and let me go from dead-stop to a running system in half an hour. Gotta love it.
Another project I’m working on is to use an Orangutan to drive an aerial photography system that is currently RC-only. I’ll have a chance to test it in the air some time next week, and have plans to use it to take pictures of the humpback whale migration that comes through here in February and March.
I LOVE THIS STUFF!