Why Is SQL Server Losing a Millisecond

Why is SQL Server losing a millisecond?

SQL Server only stores time to approximately 1/300th of a second, and these values always fall on milliseconds ending in 0, 3, or 7. E.g. counting up from 0 in the smallest increment:

00:00:00.000
00:00:00.003
00:00:00.007
00:00:00.010
00:00:00.013
...
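
You can see this rounding directly in T-SQL (a minimal sketch; the dates are just examples):

-- Each literal is rounded to the nearest representable tick:
SELECT CAST('2000-01-01 00:00:00.001' AS DATETIME)  -- returns .000
SELECT CAST('2000-01-01 00:00:00.002' AS DATETIME)  -- returns .003
SELECT CAST('2000-01-01 00:00:00.005' AS DATETIME)  -- returns .007
SELECT CAST('2000-01-01 00:00:00.009' AS DATETIME)  -- returns .010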

If you need that millisecond accuracy, there's no pleasant way around it. The best options I've seen are to store the value in custom number fields and rebuild it every time you fetch it, or to store it as a string in a known format. You can then (optionally) store an 'approximate' date in the native date type for the sake of speed, but that introduces a conceptual complexity that often isn't wanted.
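
As an illustration of the string-plus-approximate-date option, here is a minimal sketch; the table and column names are made up:

CREATE TABLE #samples
(
    ExactTime  CHAR(23) NOT NULL,  -- exact value in a fixed format, e.g. '2009-01-01 00:00:00.001'
    ApproxTime DATETIME NOT NULL   -- rounded copy, kept only for fast range queries
)
INSERT INTO #samples VALUES ('2009-01-01 00:00:00.001', '2009-01-01 00:00:00.001')  -- DATETIME copy lands on .000
SELECT ExactTime, ApproxTime FROM #samples
DROP TABLE #samples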

MS SQL Server is dropping 1 millisecond from converted time

That is how datetime works in SQL Server. If you want that millisecond, switch to datetime2([3-7]).

datetime accuracy is 1/300 of a second (about 0.00333 seconds).

datetime2 accuracy is 100 nanoseconds.

Date and Time Data Types and Functions (Transact-SQL)

Similarly, if you want to get the server time with additional accuracy, use sysdatetime(), which returns datetime2(7), instead of getdate(), which returns datetime.
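
A quick side-by-side sketch of both points (the literal values here are just examples):

DECLARE @d  DATETIME     = '2010-01-01 00:00:00.001'  -- rounds to .000
DECLARE @d2 DATETIME2(3) = '2010-01-01 00:00:00.001'  -- keeps .001
SELECT @d AS [datetime], @d2 AS [datetime2(3)],
       GETDATE() AS [getdate], SYSDATETIME() AS [sysdatetime]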

Milliseconds wrong when converting from XML to SQL Server datetime

Yes, SQL Server rounds time to increments of 1/300 of a second (3.333... milliseconds):

SELECT CAST(CAST('2009-01-01 00:00:00.000' AS DATETIME) AS BINARY(8))
SELECT CAST(CAST('2009-01-01 00:00:01.000' AS DATETIME) AS BINARY(8))

0x00009B8400000000
0x00009B840000012C

As you can see, these DATETIME values differ by one second, and their binary representations differ by 0x12C, which is 300 in decimal.

This is because SQL Server stores the time part of a DATETIME as the number of 1/300-second ticks since midnight.
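
You can pull that tick count straight out of the binary form (a minimal sketch):

DECLARE @dt DATETIME = '2009-01-01 00:00:01.000'
-- Bytes 5-8 of the 8-byte value hold the 1/300-second ticks since midnight
SELECT CAST(SUBSTRING(CAST(@dt AS BINARY(8)), 5, 4) AS INT) AS ticks  -- returns 300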

If you want more precision, you need to store the TIME part as a separate value. For example, store the time rounded to a second as a DATETIME, and the milliseconds (or whatever precision you need) as an INTEGER in a separate column.

This will let you use complex DATETIME arithmetic, like adding months or finding weekdays, and you can just add or subtract the milliseconds and concatenate the result as .XXXXXX+HH:MM to get a valid XML representation.
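
A minimal sketch of that split-storage idea (the table and column names are made up, and the sub-second part is stored as microseconds):

CREATE TABLE #events
(
    EventTime   DATETIME NOT NULL,  -- rounded down to the whole second
    EventMicros INT      NOT NULL   -- sub-second part, in microseconds
)
INSERT INTO #events VALUES ('2009-01-01 00:00:01', 123456)
-- Reassemble a single string with full sub-second precision:
SELECT LEFT(CONVERT(VARCHAR(23), EventTime, 121), 19)
       + '.' + RIGHT('000000' + CAST(EventMicros AS VARCHAR(6)), 6) AS full_time
FROM #events
DROP TABLE #events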

How can I account for datetime's lack of millisecond precision when performing datetime comparisons?

According to the documentation, datetime is "rounded to increments of .000, .003, or .007 seconds." So the C# workaround is to round the DateTime input up to the next value that follows that pattern.

Something like this:

private DateTime RoundUp(DateTime datetime)
{
    int lastMillisecondDigit;
    do
    {
        // Advance one millisecond at a time (always at least once) until the
        // value lands on a millisecond digit SQL Server can store: 0, 3, or 7.
        datetime = datetime.AddMilliseconds(1);
        lastMillisecondDigit = datetime.Millisecond % 10;
    } while (lastMillisecondDigit != 0 && lastMillisecondDigit != 3 && lastMillisecondDigit != 7);

    return datetime;
}

Note: I understand this is a big hack. There are better solutions in the comments, such as changing the data type to datetime2. But if you are unable to change the database or the code, I think this workaround will do the trick.

Weird datetime and datetime2 millisecond comparison issue

The issue is that SQL Server is implicitly casting to make the comparison, and that changes the values. Explicitly cast to DATETIME and you should get the results you are expecting.

This helps show what is happening behind the scenes that is causing the unexpected results:

declare @dt2 datetime2(7) = '2018-06-25 16:46:38.9930000'
declare @dt datetime
set @dt = @dt2

SELECT
@dt2 AS [Datetime2 value]
, @dt AS [Datetime value]
, CONVERT(DATETIME2,@dt) AS [Datetime converted to Datetime2]
, CONVERT(DATETIME2,@dt2) AS [Datetime2 converted to Datetime2]
, CONVERT(DATETIME,@dt) AS [Datetime converted to Datetime]
, CONVERT(DATETIME,@dt2) AS [Datetime2 converted to Datetime]

[Screenshot: the query's result grid, showing the datetime and datetime2 values after each conversion]

Inserting datetime with milliseconds into SQL Server table issue

According to MSDN (https://msdn.microsoft.com/en-CA/library/ms187819.aspx), the accuracy of the [DateTime] data type is "Rounded to increments of .000, .003, or .007 seconds".

To solve your issue, use DateTime2(3) as shown below:

CREATE TABLE #table
(
    DTstamp DATETIME2(3) NOT NULL
)
-- Note the '.' before the milliseconds: unlike datetime, datetime2 does not
-- accept the legacy 'hh:mm:ss:mmm' colon-separated millisecond format.
INSERT INTO #table VALUES ('1 apr 2016 15:01:02.129')
SELECT DTstamp FROM #table
DROP TABLE #table

SQL Server remove milliseconds from datetime

You just have to figure out the millisecond part of the date and subtract it out before comparison, like this:

select * 
from table
where DATEADD(ms, -DATEPART(ms, date), date) > '2010-07-20 03:21:52'
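
For example, applied to a single value:

DECLARE @d DATETIME = '2010-07-20 03:21:52.997'
SELECT DATEADD(ms, -DATEPART(ms, @d), @d)  -- returns 2010-07-20 03:21:52.000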

