Fastest Timing Resolution System

For timing, the current Microsoft recommendation is to use QueryPerformanceCounter and QueryPerformanceFrequency.

This will give you better-than-millisecond timing. If the system doesn't support a high-resolution counter, it falls back to millisecond resolution (the same as GetTickCount).

Here is a short Microsoft article with examples of why you should use it :)
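
As a quick illustration of the usual pattern (this sketch isn't from that article; the Sleep call just stands in for the code being timed):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    LARGE_INTEGER freq, start, end;
    QueryPerformanceFrequency(&freq);     /* counts per second */
    QueryPerformanceCounter(&start);

    Sleep(20);                            /* stand-in for the work being timed */

    QueryPerformanceCounter(&end);
    double ms = (double)(end.QuadPart - start.QuadPart) * 1000.0 / freq.QuadPart;
    printf("elapsed: %.3f ms\n", ms);
    return 0;
}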

Best timing method in C?

I think this should work:

#include <stdio.h>
#include <time.h>

clock_t start = clock(), diff;
ProcessIntenseFunction();
diff = clock() - start;

/* clock() measures CPU time in ticks of CLOCKS_PER_SEC */
int msec = diff * 1000 / CLOCKS_PER_SEC;
printf("Time taken %d seconds %d milliseconds\n", msec / 1000, msec % 1000);

System.currentTimeMillis vs System.nanoTime

If you're just looking for extremely precise measurements of elapsed time, use System.nanoTime(). System.currentTimeMillis() gives you wall-clock time in milliseconds since the epoch, while System.nanoTime() gives you a nanosecond-precision time relative to some arbitrary point.

From the Java Documentation:

public static long nanoTime()

Returns the current value of the most precise available system timer, in nanoseconds.

This method can only be used to measure elapsed time and is not related to any other notion of system or wall-clock time. The value returned represents nanoseconds since some fixed but arbitrary origin time (perhaps in the future, so values may be negative). This method provides nanosecond precision, but not necessarily nanosecond accuracy. No guarantees are made about how frequently values change. Differences in successive calls that span greater than approximately 292 years (2^63 nanoseconds) will not accurately compute elapsed time due to numerical overflow.

For example, to measure how long some code takes to execute:

long startTime = System.nanoTime();    
// ... the code being measured ...
long estimatedTime = System.nanoTime() - startTime;

See also: JavaDoc System.nanoTime() and JavaDoc System.currentTimeMillis() for more info.

High resolution timer

I found a solution to this problem in the following blog:
http://web.archive.org/web/20110910100053/http://www.indigo79.net/archives/27#comment-255

It explains how to use the multimedia timer to get a high-frequency timer. It's working just fine for me!
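
For reference, here's a minimal sketch of the multimedia-timer approach, assuming the blog uses timeSetEvent from winmm (the 1 ms period, the tick counter, and the callback body are just illustrative; link against winmm.lib):

#include <windows.h>
#include <mmsystem.h>
#include <stdio.h>
#pragma comment(lib, "winmm.lib")

volatile LONG g_ticks = 0;

/* Called by the multimedia timer roughly every 1 ms */
void CALLBACK TimerProc(UINT uTimerID, UINT uMsg, DWORD_PTR dwUser,
                        DWORD_PTR dw1, DWORD_PTR dw2)
{
    InterlockedIncrement(&g_ticks);
}

int main(void)
{
    timeBeginPeriod(1);                       /* request 1 ms timer resolution */
    MMRESULT id = timeSetEvent(1, 0, TimerProc, 0, TIME_PERIODIC);
    Sleep(1000);                              /* let the timer run for 1 s */
    timeKillEvent(id);
    timeEndPeriod(1);
    printf("callbacks fired in 1 s: %ld\n", g_ticks);
    return 0;
}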

Windows C++ nanosecond timing?

Use the QueryPerformanceFrequency function to see what rate QueryPerformanceCounter runs at. I think it might be in the nanosecond range.
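
You can check for yourself by printing the counter's frequency and the implied tick period; a short sketch (the output format is just illustrative):

#include <windows.h>
#include <stdio.h>

int main(void)
{
    LARGE_INTEGER freq;
    QueryPerformanceFrequency(&freq);     /* ticks per second */
    printf("counter frequency: %lld Hz\n", freq.QuadPart);
    printf("tick period: %.1f ns\n", 1e9 / (double)freq.QuadPart);
    return 0;
}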

Is there a more accurate way to create a Javascript timer than setTimeout?

Are there any tricks that can be done to ensure that setTimeout() performs accurately (without resorting to an external API) or is this a lost cause?

No and no. You're not going to get anything close to a perfectly accurate timer with setTimeout() - browsers aren't set up for that. However, you don't need to rely on it for timing things either. Most animation libraries figured this out years ago: you set up a callback with setTimeout(), but determine what needs to be done based on the value of (new Date()).getTime() (or equivalent). This allows you to take advantage of more reliable timer support in newer browsers, while still behaving appropriately on older browsers.

It also allows you to avoid using too many timers! This is important: each timer is a callback. Each callback executes JS code. While JS code is executing, browser events - including other callbacks - are delayed or dropped. When the callback finishes, additional callbacks must compete with other browser events for a chance to execute. Therefore, one timer that handles all pending tasks for that interval will perform better than two timers with coinciding intervals, and (for short timeouts) better than two timers with overlapping timeouts!

Summary: stop using setTimeout() to implement "one timer / one task" designs, and use the real-time clock to smooth out UI actions.


