Bytes IT Community

High Resolution Callback Timer (microseconds)


Anyone,

I'm looking for a way to set up a callback function in our VC++ .NET
2003 (7.1) console application using a higher-resolution timer. The
multimedia timer, at 1 millisecond, is too low resolution. We need
microsecond resolution to guarantee an exact update rate (60 or 70 Hz,
for example).

We use the Performance Counter to get microsecond resolution when
measuring elapsed time, but there don't appear to be any tools for
generating callbacks or interrupts, etc., at that resolution.
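For reference, the elapsed-time measurement described above can be sketched portably with standard C++ `<chrono>` standing in for the Win32 `QueryPerformanceCounter`/`QueryPerformanceFrequency` pair (note `<chrono>` is C++11 and postdates VC++ 2003; the function name `elapsed_us` is illustrative):

```cpp
#include <chrono>

// Portable stand-in for QueryPerformanceCounter /
// QueryPerformanceFrequency: time a callable and return the
// elapsed microseconds, measured on the monotonic clock.
template <typename Work>
long long elapsed_us(Work work)
{
    auto t0 = std::chrono::steady_clock::now();
    work();  // the code being timed
    auto t1 = std::chrono::steady_clock::now();
    return std::chrono::duration_cast<std::chrono::microseconds>(t1 - t0)
        .count();
}
```

On VC++ 2003 itself you would call `QueryPerformanceCounter` before and after the work and divide the tick delta by the frequency from `QueryPerformanceFrequency`.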

Any ideas would be greatly appreciated. Thanks,

Mike
--
Michael Evans
------------------------------------------------------------------------
Posted via http://www.codecomments.com
------------------------------------------------------------------------

Nov 17 '05 #1
2 Replies


Michael Evans wrote:
Anyone,

I'm looking for a way to set up a callback function in our VC++ .NET
2003 (7.1) console application using a higher-resolution timer. The
multimedia timer, at 1 millisecond, is too low resolution. We need
microsecond resolution to guarantee an exact update rate (60 or 70 Hz,
for example).

We use the Performance Counter to get microsecond resolution when
measuring elapsed time, but there don't appear to be any tools for
generating callbacks or interrupts, etc., at that resolution.

Any ideas would be greatly appreciated. Thanks,


You need to use an operating system other than Windows if you need
microsecond precision in timing - Windows was simply not designed to do
that.

Your only recourses under Windows are probably:

1. Use a multimedia timer to get close, and then poll at high frequency
until the desired event time.

2. Build (or find) a custom hardware device that generates interrupts at the
desired rate, then write a kernel device driver to service that interrupt.
The video display (if that's what you're updating) could be the source of
that interrupt stream.
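Technique #1 can be sketched in standard C++ (using `<chrono>` and `<thread>`, which postdate this thread, in place of the multimedia timer and `QueryPerformanceCounter`; the 2 ms slack is an assumed worst-case sleep overshoot, not a measured figure):

```cpp
#include <chrono>
#include <thread>

// Technique #1: let the coarse OS sleep get us close to the deadline,
// then busy-poll the high-resolution clock for the final stretch.
std::chrono::steady_clock::time_point
wait_until_precise(std::chrono::steady_clock::time_point deadline)
{
    using namespace std::chrono;
    const auto slack = milliseconds(2);  // assumed scheduler granularity
    if (deadline - steady_clock::now() > slack)
        std::this_thread::sleep_until(deadline - slack);  // coarse phase
    while (steady_clock::now() < deadline)                // spin phase
        ;  // busy-wait: sub-millisecond precision at 100% CPU on one core
    return steady_clock::now();
}
```

Raising the thread's priority before the spin phase (on Windows, e.g. `SetThreadPriority`) reduces the chance of being preempted mid-wait, though it cannot eliminate it.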

If it is a display device that you need to interact with at precise times
(e.g. once per vertical blank), then DirectDraw provides frame
synchronization (which it implements using technique #1 above).

Even with the above techniques you have absolutely no guarantee that your
event will occur with any predictable latency/jitter.

-cd
Nov 17 '05 #2

Michael Evans wrote:
We need microsecond resolution to guarantee an exact update rate
(60 or 70 Hz, for example). [...]


This isn't generally possible even with realtime operating systems.
Typically, OSes service timers at a particular resolution, such as every
1 ms, or whatever. If a timer callback or event is supposed to fire at
microsecond 123456, it will actually fire at the first kernel tick after
that time, say at microsecond 124000 (+ any kernel and scheduling latency).
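The quantization described above is just rounding up to the next tick boundary; a small sketch of the arithmetic (the 1000 µs period matches the example numbers, not any particular kernel):

```cpp
// Earliest tick at or after the requested time: a timer requested
// for microsecond t fires no sooner than the next multiple of the
// kernel tick period.
long long next_tick_us(long long requested_us, long long period_us)
{
    return ((requested_us + period_us - 1) / period_us) * period_us;
}
// With a 1 ms (1000 us) tick, a timer requested for t = 123456 us
// fires no earlier than 124000 us, plus kernel/scheduling latency.
```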

The two approaches Carl mentioned are the two feasible ones. A realtime
operating system will give you almost perfect timings every time using
either (as long as priorities are set appropriately), while Windows will
give you, on average, reasonable timings (as long as priorities are set
appropriately), but with no guarantees. Remember to boost the priority
of the timing process and thread as high as possible, at least during
the critical regions (e.g. during the polling stage using technique 1).

If you are just updating a display and you use DirectX, you shouldn't
need any timers better than 1 ms accuracy. What exactly are you doing?

Tom
Nov 17 '05 #3

This discussion thread is closed
