Bytes IT Community

Data structure for storing serial port data in firmware

P: 6
Hi all,

I am sending data from a Linux application through a serial port to an embedded device.

In the current implementation a byte circular buffer is used in the firmware (nothing but an array with a read and a write pointer).
As the bytes come in, they are written to the circular buffer.
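For reference, a minimal sketch of a byte circular buffer of the kind described (names and size are illustrative, not the poster's actual code) might look like:

```c
#include <assert.h>
#include <stdint.h>

/* Minimal byte circular buffer. One slot is always left empty so that
   head == tail unambiguously means "empty". The size is a power of two
   here only so the index wrap is a cheap mask. */
#define BUF_SIZE 256

static volatile uint8_t  buf[BUF_SIZE];
static volatile uint16_t head;   /* advanced by the producer (receive side) */
static volatile uint16_t tail;   /* advanced by the consumer */

/* Returns 0 on success, -1 if the buffer is full (byte dropped). */
int buf_put(uint8_t b)
{
    uint16_t next = (uint16_t)((head + 1) & (BUF_SIZE - 1));
    if (next == tail)
        return -1;               /* full: real code should count overruns */
    buf[head] = b;
    head = next;
    return 0;
}

/* Returns the number of bytes read (0 or 1). */
int buf_get(uint8_t *out)
{
    if (tail == head)
        return 0;                /* empty */
    *out = buf[tail];
    tail = (uint16_t)((tail + 1) & (BUF_SIZE - 1));
    return 1;
}
```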

Now the PC application appears to be sending the data too fast for the firmware to handle. Bytes are missed, resulting in the firmware returning WRONG_INPUT too many times.

I don't think the baud rate (115200) is the issue. A more efficient data structure on the firmware side might help. Any suggestions on the choice of data structure?

Jun 13 '09 #1
6 Replies

Expert Mod 5K+
P: 8,984
My experience (of which I have quite a lot) of receiving data on the serial port on embedded devices is that missing data means either the baud rate is too fast for the device or the device is not servicing its data-receive interrupt in a timely fashion.

It is actually quite hard to tell the difference between these two faults: you have missed data, but did it arrive too fast or did you read it too slowly? They are two sides of the same coin. So you have to assume the second, and if you are still losing data once you have done everything you can, then you try a lower baud rate.

One of the usual assumptions is that the data will not be a constant stream, that is, there will be gaps in the data during which the embedded application can process the received data. The problem is quite different if the data stream is constant.

If you are not already using interrupt-driven receive on the UART then you should be; using an interrupt helps to ensure that when data is received by the UART, the firmware actually reads it promptly.

So this leads to the following ways the software could cause the loss of data.
  1. Failure to service the interrupt routine. An interrupt routine can be stopped from running either by another interrupt routine (on a platform that does not allow nested interrupts, which is a large portion of them) or by code that disables interrupts for too long. For instance, in the last year I was using a platform that had a particularly useless OS on it: the OS's task scheduler took longer to run than the time to receive a character, and it ran in an interrupt routine, which resulted in the loss of data when receiving large quantities of data in a block, until I realised the problem.
  2. Buffers too small. You are using a circular buffer, which is a common and fine thing to do, but by no means the only option. However, whatever buffer you use, if it is too small for your largest use case (the largest amount of data you are likely to receive in a block) then you are going to lose characters when it fills up, because unless what you have to do with the data is extremely simple, or you have an extremely powerful processor, you will spend more time processing the data than it takes to receive it. That is, you are normally reliant on the gaps in the data stream to give the embedded processor time to catch up with processing the received data, so you have to be able to buffer enough data to span the largest expected burst of data between these gaps. As a rule of thumb I try to guesstimate this size and then double it.
  3. Program logic error. The code written has a logic error in it that results in losing data or state.

It is unlikely that a circular buffer written correctly would cause a problem in itself as long as it is large enough for the job in hand.

It is important that you don't try to perform all your data processing in your data-receive routine; you are unlikely to have the time. You need to buffer the data for later processing.
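That split between receive interrupt and main loop can be sketched as below. The buffer and "parser" are illustrative, and on real hardware uart_rx_isr() would be the actual UART receive ISR reading the data register (register names vary by part, so the byte is passed in here):

```c
#include <assert.h>
#include <stdint.h>

#define BUF_SIZE 64
static volatile uint8_t rx_buf[BUF_SIZE];
static volatile uint8_t rx_head, rx_tail;

/* ISR side: do the minimum -- queue the byte, no parsing in interrupt
   context. On a real MCU the argument would be read from the UART's
   data register inside this routine. */
void uart_rx_isr(uint8_t byte_from_uart)
{
    uint8_t next = (uint8_t)((rx_head + 1) % BUF_SIZE);
    if (next != rx_tail) {       /* if full, the byte is dropped;
                                    real code should count such overruns */
        rx_buf[rx_head] = byte_from_uart;
        rx_head = next;
    }
}

/* Main-loop side: drain whatever the ISR has queued and parse at
   leisure. Returns the number of bytes processed on this pass. */
unsigned process_pending(void)
{
    unsigned n = 0;
    while (rx_tail != rx_head) {
        uint8_t b = rx_buf[rx_tail];
        rx_tail = (uint8_t)((rx_tail + 1) % BUF_SIZE);
        (void)b;                 /* a real parser / state machine goes here */
        n++;
    }
    return n;
}
```

In firmware, process_pending() would be called from the main loop; the gaps in the data stream are what give it time to catch up.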

Checking that your interrupt routine is being executed in a timely fashion is quite hard to do. IIRC, the last time I had to do this I set a GPIO line high at the start of the interrupt and low at the end, and then used an oscilloscope to follow the value of the pin. That led to the discovery of the processing gap, and doing the same for the other interrupts in turn led me to the interrupt controlling task scheduling.
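The oscilloscope trick amounts to instrumenting the ISR like this. The DEBUG_PIN macros are hypothetical, to be mapped to your MCU's GPIO set/clear registers; they write a plain variable here only so the sketch compiles anywhere:

```c
#include <assert.h>

/* Hypothetical GPIO macros -- on real hardware these would write the
   port's set/clear registers. */
static volatile int debug_pin;
#define DEBUG_PIN_HIGH() (debug_pin = 1)
#define DEBUG_PIN_LOW()  (debug_pin = 0)

void instrumented_isr(void)
{
    DEBUG_PIN_HIGH();   /* rising edge on the scope = ISR entry */
    /* ... the ISR's normal work goes here ... */
    DEBUG_PIN_LOW();    /* falling edge = ISR exit; the pulse width is the
                           ISR's execution time, the spacing between pulses
                           is how often it actually runs */
}
```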

Remember, interrupt routines should be short and simple; if they are more than a page or two of code, or contain complex control structures (switches or loops), then you should start being suspicious.
Jun 13 '09 #2

Expert 10K+
P: 11,448
Assuming that the UART can receive bytes at the stated baud rate (115200 bps), the processor has to deal with a received byte roughly every 10/115200 seconds (each byte is framed by a start and a stop bit, so ten bit times on the wire). Depending on what it has to do with that byte you can calculate (given the clock speed of your CPU on the device) whether it can handle everything fast enough.
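As a back-of-envelope check, assuming standard 8N1 framing (1 start bit, 8 data bits, 1 stop bit):

```c
#include <assert.h>

/* Microseconds per byte on the wire: 10 bit times per byte with 8N1
   framing (start + 8 data + stop). */
double us_per_byte(double baud)
{
    return 10.0 * 1000000.0 / baud;
}
```

At 115200 baud that works out to about 87 microseconds per byte, which is the budget the firmware has to read, queue, and (eventually) process each one.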

A circular buffer is fast enough, but can the data be copied elsewhere fast enough?

kind regards,

Jun 13 '09 #3

Expert 100+
P: 2,418
Some serial chips interrupt on each character; others have an onboard FIFO and interrupt on every block. The advantage of a block interrupt is that it reduces the number of interrupts -- interrupt context switches are more expensive than subroutine calls. The disadvantages of a block interrupt are (a) you may need to empty the entire block before the next character is received; and (b) if the last byte of the message doesn't fill the block, then you may be faced with a long wait for the interrupt that tells you to fetch the last bytes of the message.

Can you tell if the overflow is occurring at the serial chip or if it is occurring in the driver's FIFO?
Jun 13 '09 #4

P: 6
If the FIFO is full, as in the block has yet to be read and new data has already been sent, will that data be lost, or will the sender wait?
Jun 15 '09 #5

P: 6
Thanks a lot for the patient answer.

I am using your and others' inputs and looking into it. I will get back when I make some headway.
Jun 15 '09 #6

Expert 100+
P: 2,418
How would the sender even know that the FIFO is full? There are techniques for making this known to the sender:
  • "modem control" signals CTS and RTS can be used to force the sender hardware to wait
  • software protocols like XON and XOFF can be used to ask the sender software to wait
However, it is common for no such technique to be in place. In that case a byte is lost (either the most recently received byte or the oldest entry in the FIFO is discarded) and an overflow status bit is set.
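Since the sender here is a Linux application, the software route can be requested through the standard termios interface. A minimal sketch, showing only the flag changes (a real program would tcgetattr() the attributes from the open serial-port fd, modify them, and apply them with tcsetattr()):

```c
#include <assert.h>
#include <string.h>
#include <termios.h>

/* Enable XON/XOFF software flow control in a termios configuration. */
void enable_xonxoff(struct termios *tio)
{
    tio->c_iflag |= IXON;    /* pause our output when the device sends XOFF */
    tio->c_iflag |= IXOFF;   /* send XOFF ourselves when our input fills up */
}
```

This only helps if the firmware end actually sends XOFF before its buffer overflows; hardware RTS/CTS (the CRTSCTS flag in c_cflag) is the more robust option when the wiring supports it.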
Jun 15 '09 #7
