How to Subtract Two Gettimeofday Instances

How to subtract two gettimeofday instances?

You can use the timersub() macro provided by glibc (declared in <sys/time.h>), then convert the result to milliseconds (watch out for overflow when doing this, though!).
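As a rough sketch of that approach (timersub is a BSD extension, so in strict-standards mode glibc may need _DEFAULT_SOURCE for it to be visible):

#include <stdio.h>
#include <sys/time.h>

int main(void)
{
    struct timeval start, end, diff;

    gettimeofday(&start, NULL);
    /* ... work being measured ... */
    gettimeofday(&end, NULL);

    /* timersub stores end - start in diff, with tv_usec already normalized */
    timersub(&end, &start, &diff);

    /* use a 64-bit type so a long interval cannot overflow a 32-bit long */
    long long ms = (long long)diff.tv_sec * 1000 + diff.tv_usec / 1000;
    printf("elapsed: %lld ms\n", ms);
    return 0;
}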

Difference of two gettimeofday() calls gives negative number

tv_usec gives the number of microseconds within the current second. When the time accumulates to a full second, tv_sec increases, and tv_usec restarts from zero.

When you subtract a value taken shortly before the restart from a value taken shortly after it (end minus start across a second boundary), the microseconds part of the result is negative; you have to borrow a second from the tv_sec difference.
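One way to handle that by hand, as a sketch (tv_diff here is just an illustrative helper, not a standard function), is to borrow a second whenever the microsecond part of the difference goes negative:

#include <sys/time.h>

/* end - start, normalized so that 0 <= tv_usec < 1000000 */
struct timeval tv_diff(struct timeval start, struct timeval end)
{
    struct timeval d;
    d.tv_sec  = end.tv_sec  - start.tv_sec;
    d.tv_usec = end.tv_usec - start.tv_usec;
    if (d.tv_usec < 0) {
        d.tv_usec += 1000000;  /* borrow one second */
        d.tv_sec  -= 1;
    }
    return d;
}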

C - gettimeofday for computing time?

Your curtime variable holds the number of seconds since the epoch. If you get one before and one after, the later one minus the earlier one is the elapsed time in seconds. You can subtract time_t values just fine.
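A small illustration of that (the variable names are placeholders, not taken from the original code):

#include <stdio.h>
#include <sys/time.h>

int main(void)
{
    struct timeval tv;

    gettimeofday(&tv, NULL);
    time_t before = tv.tv_sec;   /* seconds since the epoch */

    /* ... work being measured ... */

    gettimeofday(&tv, NULL);
    time_t after = tv.tv_sec;

    /* later minus earlier gives elapsed whole seconds */
    printf("elapsed: %ld s\n", (long)(after - before));
    return 0;
}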

Trying to use gettimeofday to find elapsed time in milliseconds

printf with %ld expects a long integer, not a floating-point value. Try %f or similar instead of %ld. Also, you might want to check for (and deal with) the case where the microseconds difference comes out negative.
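A sketch of that idea, computing the elapsed time as a double so it can be printed with %f (the seconds and microseconds parts are combined before dividing, so a negative microsecond difference is absorbed automatically):

#include <stdio.h>
#include <sys/time.h>

int main(void)
{
    struct timeval start, end;

    gettimeofday(&start, NULL);
    /* ... work being measured ... */
    gettimeofday(&end, NULL);

    double ms = (end.tv_sec - start.tv_sec) * 1000.0 +
                (end.tv_usec - start.tv_usec) / 1000.0;
    printf("elapsed: %f ms\n", ms);   /* %f matches the double */
    return 0;
}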

Perl - Calculate difference between two timestamps

I recommend the Time::HiRes::Value module, which supports arithmetic on objects that hold separate second and microsecond values (the same representation returned by Time::HiRes::gettimeofday). It also provides a convenient stringification overload.

You would need to write a simple parsing subroutine though, like this:

use strict; 
use warnings;

use Time::HiRes::Value;

my $t0 = parse_time('15:45:30.125');
my $t1 = parse_time('18:12:46.886');

print $t1 - $t0;

sub parse_time {
    # grab the numeric fields: hours, minutes, seconds, milliseconds
    my ($h, $m, $s, $ms) = shift =~ /\d+/g;

    # seconds since midnight, plus the fractional part converted to microseconds
    Time::HiRes::Value->new(($h * 60 + $m) * 60 + $s, $ms * 1000);
}

Output:

8836.761000

Elapsed time in negative value

maybe something along the lines of:

long long t = (end.tv_sec * 1000000LL + end.tv_usec) - (start.tv_sec * 1000000LL + start.tv_usec);

Difference of two times in C++

Use difftime:

double diff = difftime(t2, t1);

This gives you the difference in seconds. Multiply diff by 1000 to get milliseconds.
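For example, a minimal sketch using two time_t values (which is what difftime expects; note that time() only has whole-second resolution, so multiplying by 1000 just changes the unit):

#include <stdio.h>
#include <time.h>

int main(void)
{
    time_t t1 = time(NULL);
    /* ... work being measured ... */
    time_t t2 = time(NULL);

    double diff = difftime(t2, t1);        /* elapsed seconds as a double */
    printf("elapsed: %.0f ms\n", diff * 1000.0);
    return 0;
}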

What is the precision of the gettimeofday function?

The average number of microseconds that pass between consecutive calls to gettimeofday is usually less than one - on my machine it is somewhere between 0.05 and 0.15.

Modern CPUs usually run at GHz speeds - i.e. billions of instructions per second - so two consecutive instructions should take on the order of nanoseconds, not microseconds (obviously two calls to a function like gettimeofday are more complex than two simple opcodes, but they should still take on the order of tens of nanoseconds, not more).

But you are performing integer division - dividing (current_time[MAX_TIMES - 1].tv_usec - current_time[0].tv_usec) by MAX_TIMES - which in C yields an int as well, in this case 0.


To get the real measurement, divide by (double)MAX_TIMES (and print the result as a double):

printf("the average time of a gettimeofday function call is: %f us\n", (current_time[MAX_TIMES - 1].tv_usec - current_time[0].tv_usec) / (double)MAX_TIMES);

As a bonus - on Linux systems the reason gettimeofday is so fast (you might expect it to be a more complex function that calls into the kernel and incurs the overhead of a syscall) is a special mechanism called the vDSO, which lets the kernel map a small amount of code and data into user space so that calls like gettimeofday can be answered without actually entering the kernel.

Time difference in C++

You have to use one of the more specific time structures, either timeval (microsecond-resolution) or timespec (nanosecond-resolution), but you can do it manually fairly easily:

#include <sys/time.h>

int diff_ms(timeval t1, timeval t2)
{
    return (((t1.tv_sec - t2.tv_sec) * 1000000) +
            (t1.tv_usec - t2.tv_usec)) / 1000;
}

This obviously has some problems with integer overflow if the difference in times is really large (or if you have 16-bit ints), but that's probably not a common case.
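A possible way to call it, assuming diff_ms from above is in the same file (the later timestamp is passed first, since the function computes t1 - t2):

#include <stdio.h>
#include <sys/time.h>

int main(void)
{
    struct timeval start, end;

    gettimeofday(&start, NULL);
    /* ... work being measured ... */
    gettimeofday(&end, NULL);

    printf("elapsed: %d ms\n", diff_ms(end, start));
    return 0;
}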


