"Mark McIntyre" <ma**********@spamcop.net> wrote in message
news:91********************************@4ax.com...
> On 1 Oct 2005 13:14:05 -0700, in comp.lang.c , "lucifer"
> <um**********@gmail.com> wrote:
>
>> actually i am transmitting data over wireless medium so i
>> have to control the data ie slow it down
>
> I think you have a bigger problem if you need to slow the data
> transmission rate down because of this. Your algorithm is wrong.
> How about a queue which signals or blocks when it's full?
>
>> thats i have to get a delay in the transmitting
>> function inversely to the speed of the the processor
>
> You /definitely/ have the algo wrong....
> --
I agree that "lucifer" does have a ways to go.
There may be good reasons not to overstuff a wireless protocol stack with
data.
------
[OT content start]
Bluetooth has a defined but annoying behavior: when the connection is lost
with data still in the lower levels of the sending or receiving stacks, that
data is discarded and never reaches the receiving application.
Depending on just which Bluetooth profile the sending application is using,
it may have no way to find out how much data was actually discarded.
A specific example of this occurs when the sending application closes the
Bluetooth virtual serial port as soon as the system's serial write call
returns.
In many Bluetooth stack implementations the virtual serial port driver
returns when the buffer has been handed to the Bluetooth stack, not when the
stack has delivered the data to the receiving application.
When the Bluetooth virtual serial port is closed too soon, data will be
discarded and the sending application will not be notified.
This example is off topic in this group, but it is offered as a thin
justification as to why someone might need a standard C method to limit the
maximum throughput of a function.
[OT content end]
------
One method (lots of hand waving) would be to write the sending function so
that it:
1. Counts the number of octets in the buffer to be sent.
2. Calculates an estimate of how much real time should be required to send
   that many octets.
3. Takes a system time stamp before calling the send API.
4. Takes a second system time stamp when the API returns.
5. Subtracts the first stamp from the second, giving the API delta time.
6. Subtracts the API delta time from the estimate, then holds off sending
   another buffer until that much time has passed.
Exactly how to do this would, of course, be an exercise for the Original
Poster.
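A rough sketch of those steps might look like the following. Note that this
necessarily steps outside strict ISO C, which has no sub-second sleep:
clock_gettime() and nanosleep() here are POSIX, and both send_api() and the
LINK_BYTES_PER_SEC figure are made-up placeholders for the OP's actual
transport call and measured link rate.

```c
#define _POSIX_C_SOURCE 199309L  /* for clock_gettime() and nanosleep() */
#include <stddef.h>
#include <time.h>

#define LINK_BYTES_PER_SEC 11520.0  /* assumed sustainable link rate */

/* Placeholder for the real send call (e.g. a write to the Bluetooth
 * virtual serial port).  Here it just pretends the data was queued. */
static void send_api(const unsigned char *buf, size_t len)
{
    (void)buf;
    (void)len;
}

/* Current monotonic time in seconds (POSIX, not ISO C). */
static double now_seconds(void)
{
    struct timespec ts;
    clock_gettime(CLOCK_MONOTONIC, &ts);
    return (double)ts.tv_sec + ts.tv_nsec / 1e9;
}

/* Send buf, then hold off until enough real time has passed for len
 * octets at LINK_BYTES_PER_SEC, no matter how quickly send_api()
 * returned after merely buffering the data. */
void paced_send(const unsigned char *buf, size_t len)
{
    double budget = (double)len / LINK_BYTES_PER_SEC; /* estimated wire time */
    double start  = now_seconds();   /* first time stamp */

    send_api(buf, len);

    double elapsed = now_seconds() - start; /* API delta time */
    double remain  = budget - elapsed;      /* time still owed */

    if (remain > 0.0) {
        struct timespec ts;
        ts.tv_sec  = (time_t)remain;
        ts.tv_nsec = (long)((remain - (double)ts.tv_sec) * 1e9);
        nanosleep(&ts, NULL);
    }
}
```

The monotonic clock matters here: a wall-clock source could jump (NTP, DST)
and make the hold-off wildly wrong. This still only limits the average rate
per buffer; it does nothing about data the stack silently discards.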