Making the receiver "find" the correct channel

What I would like to do is set the radio channel in my transmitter app and have the receiver “find” and lock on to the transmitted channel. I’ll include my actual transmission channel number in the packet structure, and I’m hoping the receiver can simply scan through ALL the channels. As soon as it gets close enough to get a good CRC on one of my packets, the receiver can take the exact channel right from the transmitted message.
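Concretely, the kind of layout I have in mind is something like this (a plain-C sketch; RADIO_PACKET_SIZE, myChannelID, and extractChannel are just my own placeholder names, not anything from the SDK):

```c
#include <stdint.h>

#define RADIO_PACKET_SIZE 18   /* placeholder payload size */

/* Proposed over-the-air layout: byte 0 is the length byte the radio
   hardware expects, byte 1 is the channel the transmitter is really on. */
typedef struct
{
    uint8_t length;                         /* = RADIO_PACKET_SIZE */
    uint8_t myChannelID;                    /* transmitter's actual channel */
    uint8_t payload[RADIO_PACKET_SIZE - 1]; /* application data */
} RadioPacket;

/* Once a packet passes CRC, the receiver reads the channel straight out. */
uint8_t extractChannel(const uint8_t *buf)
{
    return buf[1];   /* offset of myChannelID in the layout above */
}
```

So one good CRC on any channel is enough to learn the real channel and jump straight to it.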

So based on the spectrum analyzer I’ve seen in another post, it seems I can pretty much set up for receiving and change the CHANNR variable anytime I want? Could someone verify whether this is true, or tell me if I’m missing something? Here is my proposed code. I understand the seek time could be improved by doing something more robust than incrementing a channel number by “1”, but I’m just trying to verify whether changing CHANNR is all I need to do. My initialization will be something like this, and is based on testRadioSignalRx.c:

void receiverInit()
{
    CHANNR = 128;                 // any initial value
    PKTLEN = RADIO_PACKET_SIZE;   // my expected pre-defined packet size
    MCSM0 = 0x14;    // Auto-calibrate when going from idle to RX or TX.
    MCSM1 = 0x00;    // Disable CCA.  After RX, go to IDLE.  After TX, go to IDLE.
    // We leave MCSM2 at its default value.

    // This DMA config is always the same for DMA->radio transfers.
    dmaConfig.radio.DC6 = 19;     // WORDSIZE = 0, TMODE = 0, TRIG = 19 (always 19 for RADIO)
    dmaConfig.radio.SRCADDRH = XDATA_SFR_ADDRESS(RFD) >> 8;
    dmaConfig.radio.SRCADDRL = XDATA_SFR_ADDRESS(RFD);
    dmaConfig.radio.DESTADDRH = (unsigned int)packet >> 8;  // my actual packet address
    dmaConfig.radio.DESTADDRL = (unsigned int)packet;
    dmaConfig.radio.LENL = 1 + RADIO_PACKET_SIZE + 2;       // my expected packet size + 2
    dmaConfig.radio.VLEN_LENH = 0b10000000;                 // Transfer length is FirstByte+3
    dmaConfig.radio.DC7 = 0x10;   // SRCINC = 0, DESTINC = 1, IRQMASK = 0, M8 = 0, PRIORITY = 0

    DMAARM |= (1<<DMA_CHANNEL_RADIO);  // Arm DMA channel
    RFST = 2;                          // Switch radio to RX mode.
}


So with that done, I’m hoping all I will have to do is write a “detect()” loop, where I listen until I get a good message and then permanently switch CHANNR, like the below?

void detect()  // detect my packet and switch to that channel permanently when found.
{
    uint32 timeNow = getMs();
    uint8 channel = 0, try;
    while (1)  // loop forever until we detect.
    {
        CHANNR = channel;
        channel++;  // keep revolving 0->255 and back
        // Listen 4 times if bad CRC, 100 ms between tries.
        for (try = 0; try < 4; try++)
        {
            while ((getMs() - timeNow) < 100);  // retry every 100 ms
            timeNow = getMs();
            if (!(RFIF & (1<<4))) break;      // nothing at all... try a different channel
            if (!radioCrcPassed()) continue;  // bad CRC... try again

            // Got a good CRC, so extract my embedded channel.
            // (Assume pPacketStruct already points to my packet structure.)
            CHANNR = pPacketStruct->myChannelID;
            return;  // locked on; done
        }
    }
}


I modified test_radio_signal_rx and test_radio_signal_tx a while ago to step through the channels to see which channel had the best performance. Yes, you can simply change the channel on the fly by setting a new value for CHANNR.



I don’t think that changing CHANNR would have any effect until the radio recalibrates its frequency synthesizer, so you have to somehow make sure that happens. The code we provide for the Wixel generally ensures that the calibration happens periodically so maybe you can just change CHANNR like Mike did.

The radio interface in the CC2511 is quite complicated, as you can see if you look at Figure 54, the Complete Radio Control State Diagram in the datasheet. You want to make sure that you get the radio to go through the calibration state (number 8) after changing CHANNR.



Ah, the re-sync! That’s what was wrong! OK, let’s see if I can trick the radio logic. I already have this in my initialization…

MCSM0 = 0x14; // Auto-calibrate when going from idle to RX or TX.
MCSM1 = 0x00; // Disable CCA. After RX, go to IDLE. After TX, go to IDLE.

So normally I’d then just go to receive mode…

RFST = 2;

So I wouldn’t have to do anything until I actually got “something”, after which I’d end up in IDLE mode. Then I’d clear IRQ_DONE, re-arm the DMA channel, and go back to receive mode as usual.

RFIF &= ~(1<<4); // Clear IRQ_DONE
DMAARM |= (1<<DMA_CHANNEL_RADIO); // Arm DMA channel
RFST = 2; // Switch radio to RX mode.

But now, at least during start-up, a failed receive could mean I need to switch channels, which means I’ll never get to the IDLE state, and so won’t automatically re-sync. Well, my transmit application sends a minimum of 1 packet per 100 ms when it has nothing “new” to send. So after I fail to receive anything for some time (let’s say 1/2 second), I can force IDLE mode, change my channel, then force receive mode again…

RFST = 4; // go to idle mode
CHANNR = someNewSeekChannel; // this would increment/ change somewhere
RFST = 2;

Then try again for another 1/2 second and repeat as needed. This actually DOES seem to work, though the overall logic needs improvement. For example, I transmit my actual channel number within my packet data, so once I get a good packet I can repeat the above once more, this time with the actual channel. Second, with 256 channels to choose from, I think I’ll be pretty safe just offering maybe 16 channels, thus incrementing CHANNR by 16 on each unsuccessful pass. That should make the actual “time to find” shorter: 1/2 second of listening × 16 possible channels is only 8 seconds max for the receiver to “find” the transmitter. Really, maybe I don’t even need THAT many channels.
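For what it’s worth, the stepping itself is trivial in plain C (CHANNEL_STEP and nextSeekChannel are just my own names), relying on uint8 wraparound to cycle 0, 16, 32, …, 240 and back to 0:

```c
#include <stdint.h>

#define CHANNEL_STEP 16   /* assumes the transmitter sits on a multiple of 16 */

/* Next channel to try after an unsuccessful half-second listen.
   The uint8_t wrap brings us back to 0 after 240. */
uint8_t nextSeekChannel(uint8_t current)
{
    return (uint8_t)(current + CHANNEL_STEP);
}
```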

So do you see any major “gotcha” in my logic? Also, is the slight 10 ms delay I’m putting between state and radio variable changes really needed for the CPU to get its own state machine in sync with mine, or does that not matter at all? It was just a guess.

It sounds like you are on the right track and I am glad you were able to get it working.

Your first 10 ms delay (before setting CHANNR) is probably not needed, and maybe you don’t need the second one either. I would try taking them out and see if the system still works. If you do need a delay, it would be better to have a while loop that waits for the radio to get to the desired state by reading the MARCSTATE register.
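For example, something along these lines (an untested sketch: 1 and 13 are the IDLE and RX values of MARCSTATE listed in the datasheet’s MARC state table, and someNewSeekChannel is your own variable):

```c
RFST = 4;                     // SIDLE: abort RX and return to idle
while (MARCSTATE != 1);       // wait until the radio really is in IDLE
CHANNR = someNewSeekChannel;  // pick the next channel to try
RFST = 2;                     // SRX: calibrate, then enter RX
while (MARCSTATE != 13);      // wait until the radio is actually in RX
```

That way you wait exactly as long as the state transition takes, instead of a fixed guess.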

In most of our Wixel applications, we configure the radio to have a receive timeout to guarantee that it will go back into idle mode regularly, thus forcing it to re-calibrate the frequency synthesizer.

Here is the code from radio_mac.c that sets up the timeout:

void radioMacRx(uint8 XDATA * packet, uint8 timeout)
{
    if (timeout)
    {
        MCSM2 = 0x00;   // RX_TIME = 0.  Helps determine the units of the RX timeout period.
        WORCTRL = 0;    // WOR_RES = 0.  Helps determine the units of the RX timeout period.
        WOREVT1 = timeout;
        WOREVT0 = 0;
    }
    // ...
}

I forget if there is some register that controls what state you go to when the RX timeout occurs, or maybe you just always go to IDLE.


Thanks Dave. It does seem to work without the timeouts, but without digging deep enough to figure out the CPU cycles needed for internal state transitions to complete, I’ll probably keep at least the second delay. I can certainly spare it during this transmitter-detect phase at startup. I will look at that MARCSTATE register, though.

Now in my app the receiver is ONLY receiving (there is never any message acknowledgment going on). So I’m inclined to think it will be easier for the transmitter to run the show if I don’t ever use the timeout feature, and stay open to packets whenever they come in. Of course I’m kind of creating my own timeout in the app code now, but that’s a LOT easier to debug. And yes… that radio state diagram does look a little scary.

This is turning out to be a VERY useful project/product by the way! In fact I’m already starting my own PCB layouts! :slight_smile: I knew what I wanted to do with the Wixel when I found it, but at this point it has far surpassed ALL my expectations.

I am glad you are making progress and are happy with the Wixel. Please let us know more details about your project when it is ready!