Hello all. I am interfacing my computer to the outside world for an
experiment, and we would like to know how much time each call into the
interface takes. The specs say a call should complete on the order of a
microsecond, which is faster than we need. We would like to make 30 or so
of these calls in 0.5 milliseconds, and before committing to purchasing
this interface system we would like to make sure it fits our needs.
However, we are having difficulty timing how long our functions take. We
are running Microsoft Visual C++ 6.0 (compiling as C) on a computer
running Win98. We tried something like the following:
#include <time.h>

clock_t start, stop;
double duration;

start = clock();
my_function_to_test();   /* the interface call we want to time */
stop = clock();
/* cast to double so the division is not truncated to whole seconds */
duration = (double)(stop - start) / CLOCKS_PER_SEC;
It has become apparent that the resolution of clock() is only on the
order of a second. I have tried looking at the C FAQ and Google for
something that would give better resolution, but have had no luck. We are
thinking of looping our test function thousands or millions of times and
dividing the total duration by the number of iterations (roughly the
sketch below), but I would like a better idea of how long each individual
call takes.
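Here is a minimal sketch of that loop-and-divide idea, assuming
my_function_to_test() is supplied by the interface library and that
ITERATIONS is just a placeholder count we would tune:

#include <stdio.h>
#include <time.h>

#define ITERATIONS 100000L        /* placeholder; pick a count large
                                     enough to run for several seconds */

void my_function_to_test(void);   /* provided by the interface library */

int main(void)
{
    clock_t start, stop;
    double per_call;
    long i;

    start = clock();
    for (i = 0; i < ITERATIONS; i++)
        my_function_to_test();
    stop = clock();

    /* average seconds per call over all iterations */
    per_call = (double)(stop - start) / CLOCKS_PER_SEC / ITERATIONS;
    printf("average time per call: %g seconds\n", per_call);
    return 0;
}

This only gives an average over many calls, which is why I am still
hoping for a timer with better resolution than clock().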
Thanks in advance
Kevin