C# DateTime.Now Precision

Why would DateTime.Now be made less precise than what most CPU clocks could handle?

A good clock should be both precise and accurate; those are different things. As the old joke goes, a stopped clock is exactly accurate twice a day, but a clock that runs a minute slow is never accurate at any time. Yet the slow clock is always precise to the nearest minute, whereas the stopped clock has no useful precision at all.

Why should DateTime be precise to, say, a microsecond when it cannot possibly be accurate to the microsecond? Most people have no source of official time signals that is accurate to the microsecond. Therefore, giving six digits of precision after the decimal place, the last five of which are garbage, would be lying.

Remember, the purpose of DateTime is to represent a date and time. High-precision timing is not at all the purpose of DateTime; as you note, that's the purpose of Stopwatch. The purpose of DateTime is to represent a date and time for purposes like displaying the current time to the user, computing the number of days until next Tuesday, and so on.
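For instance, the "days until next Tuesday" computation needs only whole-day arithmetic, which DateTime handles comfortably. A minimal sketch (the helper name is my own):

```csharp
using System;

// Days until the next Tuesday strictly after 'from' (result is 1..7).
static int DaysUntilNextTuesday(DateTime from)
{
    int days = ((int)DayOfWeek.Tuesday - (int)from.DayOfWeek + 7) % 7;
    return days == 0 ? 7 : days; // "next" Tuesday, not today
}

Console.WriteLine(DaysUntilNextTuesday(new DateTime(2024, 1, 1))); // a Monday -> 1
```

No sub-second precision is involved anywhere in that calculation, which is the point.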

In short, "what time is it?" and "how long did that take?" are completely different questions; don't use a tool designed to answer one question to answer the other.
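To make the distinction concrete, here is a small sketch using each tool for its own question (the 50 ms sleep stands in for real work):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

// "What time is it?" -- a wall-clock question: use DateTime.
DateTime startedAt = DateTime.Now;
Console.WriteLine($"Started at {startedAt:HH:mm:ss}");

// "How long did that take?" -- an elapsed-time question: use Stopwatch.
var sw = Stopwatch.StartNew();
Thread.Sleep(50); // the work being timed
sw.Stop();
Console.WriteLine($"Took {sw.Elapsed.TotalMilliseconds:F1} ms");
```

Stopwatch reads the high-resolution performance counter, so the elapsed figure is meaningful at sub-millisecond scale even when the wall clock is not.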

Thanks for the question; this will make a good blog article! :-)

DateTime precision in .NET Core

The explanation is that .NET asks the underlying operating system for the current time, and the operating system asks the underlying hardware. In ancient times, the system clock visible to software updated only about once every 15.6 milliseconds, the interval of the default timer interrupt (64 ticks per second). Remember, those were the days of "slow" computers, and designers tried to squeeze out every bit of performance they could, so the OS did not consult the clock hardware every time someone asked for the time; it handed back a cached copy of the value, which was updated very infrequently.

Somewhere down the line, motherboards evolved and the clock hardware became more precise, but the OS and everything on top of it felt no need to change. Hardware has always evolved far faster than software, and even today consumer-grade software wastes a large fraction of raw hardware capability. So when .NET asked the OS for the time, it got back imprecise data even when the hardware was capable of better. The granularity did improve from about 15 ms to below 1 ms, but that was it.

Come Windows 8 (Server 2012), it was finally recognized that (1) applications could do better with more precise time, (2) computers are fast, so consulting the clock hardware on every call is no longer a problem, and (3) a large population of programmers and programs are used to, and actually rely on, the imprecise-time behavior. So Windows 8 introduced a new, marginally slower mechanism to obtain the most precise time data, but left the original implementation unchanged.

.NET had always used the older, imprecise OS function GetSystemTimeAsFileTime, and when its new cousin GetSystemTimePreciseAsFileTime appeared in Windows 8, .NET chose the backward-compatible path and did nothing.
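On .NET Framework you can still reach the newer API yourself via P/Invoke. A sketch, assuming Windows 8 / Server 2012 or later (the wrapper class name is my own):

```csharp
using System;
using System.Runtime.InteropServices;

static class PreciseTime
{
    // Available on Windows 8 / Server 2012 and later only;
    // calling it on older Windows fails at runtime.
    [DllImport("kernel32.dll")]
    private static extern void GetSystemTimePreciseAsFileTime(out long fileTime);

    public static DateTime UtcNow()
    {
        GetSystemTimePreciseAsFileTime(out long ft);
        // FILETIME is 100 ns ticks since 1601-01-01; convert to a UTC DateTime.
        return DateTime.FromFileTimeUtc(ft);
    }
}
```

DateTime.FromFileTimeUtc does the epoch conversion for you and returns a value with Kind set to Utc.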

.NET Core is a fresh rewrite of many core features, and it now leverages the high-precision data source.
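You can see the difference empirically by counting how many distinct DateTime.UtcNow values appear in a tight loop: on .NET Framework on Windows the count stays small (the clock advances in coarse steps), while on .NET Core it is far larger. A sketch:

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

var seen = new HashSet<long>();
var sw = Stopwatch.StartNew();
while (sw.ElapsedMilliseconds < 100)
    seen.Add(DateTime.UtcNow.Ticks); // each distinct value = one clock step

Console.WriteLine($"{seen.Count} distinct timestamps observed in 100 ms");
```

The exact count depends on the runtime, OS, and machine, so treat it as a diagnostic, not a specification.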

Edit

If the current time reads 13:14:15.123456, there is still no guarantee that the real, true time, as seen by physicists and astronomers, is exactly that. Your computer is not an atomic clock, and certainly not a well-synchronized one. The only thing it means is that if two events received different timestamps, then one event certainly happened before the other. On older computers, the rate at which events were generated (e.g. logs, files, database transactions) was lower, so there was little chance that sequential events would be assigned the same timestamp. The new time source caters to modern high-rate activity, so that sequential events can be marked as distinct. Still, for two very close events there will always be a chance of an identical timestamp; that is ultimately unavoidable. If you need nanosecond-level measurement (why?), you need a different tool such as Stopwatch, not System.DateTime.

How to get precise DateTime in C#

You can use Stopwatch for more precise measurements of time. You can then have each log entry record the time from the start of the first operation. If it's important, you can record the DateTime of the first operation and therefore calculate the times of the rest, but it sounds like just having the ticks/nanoseconds since the start of the first operation is good enough for your purposes.
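A common pattern for this is to anchor one DateTime at the start and add Stopwatch's elapsed time to it for each log entry. A minimal sketch (the class name is my own):

```csharp
using System;
using System.Diagnostics;
using System.Threading;

var clock = new LogClock();
Thread.Sleep(20); // some work happens...
Console.WriteLine($"Event at {clock.Now:HH:mm:ss.fffffff}");

// Wall-clock anchor plus high-resolution elapsed time.
class LogClock
{
    private readonly DateTime _start = DateTime.UtcNow;    // anchored once
    private readonly Stopwatch _sw = Stopwatch.StartNew(); // precise since then

    public DateTime Now => _start + _sw.Elapsed;
}
```

Note that the anchor drifts relative to the real wall clock over long runs (Stopwatch is not synchronized to anything), which is harmless when all you need is ordering and relative spacing of log entries.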

DateTime precision?

Internally, DateTime.Now calls:

public static DateTime Now
{
get
{
return DateTime.UtcNow.ToLocalTime();
}
}

which calls internally:

public static DateTime UtcNow
{
get
{
long systemTimeAsFileTime = DateTime.GetSystemTimeAsFileTime();
return new DateTime((ulong)(systemTimeAsFileTime + 504911232000000000L) | 4611686018427387904UL);
}
}

where GetSystemTimeAsFileTime is the Windows API function that returns the system clock as a FILETIME. The accuracy therefore depends on the system.
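The two magic numbers in that decompiled getter are checkable: 504911232000000000 is the tick count from DateTime.MinValue (0001-01-01) to the FILETIME epoch (1601-01-01), and 4611686018427387904 is 0x4000000000000000, the bit pattern that encodes DateTimeKind.Utc in DateTime's packed 64-bit representation:

```csharp
using System;

// FILETIME counts 100 ns ticks since 1601-01-01; DateTime ticks since 0001-01-01.
long fileTimeOffset = (new DateTime(1601, 1, 1) - DateTime.MinValue).Ticks;
Console.WriteLine(fileTimeOffset == 504911232000000000L); // True

// 0x4000000000000000 = 2^62, the "Kind is Utc" flag in the packed representation.
Console.WriteLine(4611686018427387904L == 0x4000000000000000L); // True
```

So the getter simply shifts the FILETIME onto DateTime's epoch and tags the result as UTC.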

If for some reason there is a delay between two reads of DateTime.Now, the results may differ enough that an equality comparison fails. But personally, I have never run into that condition.

.NET datetime Millisecond precision issue when converting from string to datetime

According to the Custom Date and Time Format Strings docs, seven ("fffffff") is the maximum supported number of fractional-second digits.
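That limit matches DateTime's 100 ns tick resolution, so parsing round-trips at most seven fractional digits. A sketch:

```csharp
using System;
using System.Globalization;

string s = "2023-05-01 13:14:15.1234567";
DateTime dt = DateTime.ParseExact(
    s, "yyyy-MM-dd HH:mm:ss.fffffff", CultureInfo.InvariantCulture);

// The fractional part survives as 1,234,567 ticks (= 123.4567 ms).
Console.WriteLine(dt.Ticks % TimeSpan.TicksPerSecond);
```

An eighth fractional digit cannot be expressed in the format string; there is nothing below a tick for it to map to.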
