Millisecond Resolution of Datetime in Ruby

Millisecond resolution of DateTime in Ruby

Changing m.happened_at = '2012-01-01T00:00:00.32323'.to_datetime in the code above to m.happened_at = '2012-01-01T00:00:00.32323' (that is, assigning the raw string and dropping the .to_datetime call) solves the problem, though I have no idea why.
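
A minimal sketch of the working assignment, assuming an ActiveRecord model with a happened_at datetime column (the Measurement model name here is made up):

m = Measurement.new                            # hypothetical model name
m.happened_at = '2012-01-01T00:00:00.32323'    # assign the raw string; no .to_datetime
m.save!
m.reload.happened_at.strftime('%H:%M:%S.%L')   # fractional seconds survive, provided the column stores them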

How do I get elapsed time in milliseconds in Ruby?

As stated already, you can operate on Time objects as if they were numeric (or floating point) values. Subtracting one Time from another yields a Float number of seconds, including the sub-second part, which is easily converted to milliseconds.

For example:

def time_diff_milli(start, finish)
  (finish - start) * 1000.0
end

t1 = Time.now
# arbitrary elapsed time
t2 = Time.now

msecs = time_diff_milli t1, t2

You will need to decide whether to truncate that or not.
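
For instance, to truncate or round the result from the snippet above:

msecs.to_i    # truncate to whole milliseconds
msecs.round   # or round to the nearest millisecond instead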

How to time an operation in milliseconds in Ruby?

You can use Ruby's Time class. For example:

t1 = Time.now
# processing...
t2 = Time.now
delta = t2 - t1 # in seconds

Now, delta is a Float, so you can get as fine-grained a result as the class provides.
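
For instance, to express that delta in milliseconds:

delta_ms = delta * 1000.0   # elapsed time in milliseconds, still a Float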

XML Serialization is not including milliseconds in datetime field from Rails model

As it turns out, I can exclude the original datetime field and add a custom method which renders the datetime as a string in to_xml. This feels hackish, but it's working. Is there another way to get milliseconds directly in the original datetime field?

In each model, I exclude (:except) the datetime field names I want changed, and I include (:methods) a method with the same name that returns the attribute before it is typecast.

def to_xml(options = {})
  options[:methods] = [:some_datetime]
  options[:except] = [:some_datetime]
  super
end

def some_datetime
  attribute_before_type_cast('some_datetime')
end

Rendering to_xml is working great with models included and any other options I pass in.
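
A usage sketch of the override (the Event model and :comments association are made up for illustration):

event = Event.first
event.to_xml(include: :comments, skip_instruct: true)
# :some_datetime is rendered via the method above, so the raw
# pre-typecast string (with its milliseconds) ends up in the XML.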

When creating objects, created_at stores microsecond precision that json can't match

Ultimately, I decided on changing the datetime columns in my DB.

I basically applied the following wherever relevant in a migration:

change_column table, column, :datetime, limit: 3

That successfully ensures that my :datetime columns all store no more than millisecond precision.
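
For reference, a minimal migration sketch applying that change (table and column names are placeholders, and on newer Rails you may need precision: 3 rather than limit: 3):

class ChangeDatetimePrecisionToMilliseconds < ActiveRecord::Migration[5.2]
  def up
    change_column :events, :created_at, :datetime, limit: 3
    change_column :events, :updated_at, :datetime, limit: 3
  end

  def down
    change_column :events, :created_at, :datetime
    change_column :events, :updated_at, :datetime
  end
end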

I think it is a good change: when the sole purpose of my back end is to render a JSON API, it should be cohesive through and through. The JSON API convention for timestamps is ISO 8601, so I'd like my DB to adhere to that as well.

How to extract the year, tz-convert, and get the millisecond part from a nanotime timestamp?

I do all of this with data.table because data.table supports the underlying bit64 package and the integer64 representation needed here; other containers do not.

Code

library(nanotime)
library(data.table)

DT <- data.table(ts = c(nanotime('2011-12-05 08:30:00.000', format = "%Y-%m-%d %H:%M:%E9S", tz = "GMT"),
                        nanotime('2011-12-05 08:30:00.700', format = "%Y-%m-%d %H:%M:%E9S", tz = "GMT"),
                        nanotime('2011-12-05 08:30:00.825', format = "%Y-%m-%d %H:%M:%E9S", tz = "GMT")))
DT[, pt := as.POSIXct(ts)]                           # convert to POSIXct
DT[, millis := as.numeric(pt - trunc(pt)) * 1e3]     # fractional second, in milliseconds

Result

R> DT
                                     ts                      pt millis
1: 2011-12-05T08:30:00.000000000+00:00 2011-12-05 02:30:00.000      0
2: 2011-12-05T08:30:00.700000000+00:00 2011-12-05 02:30:00.700    700
3: 2011-12-05T08:30:00.825000000+00:00 2011-12-05 02:30:00.825    825
R>

Time-zone shifts are a different (and often misunderstood) topic; you can apply them to the POSIXct column.

Note that everything you did and asked for here only requires millisecond resolution, so no need for nanotime has been demonstrated so far. But what I showed also works at nanosecond resolution -- I use it every day from data.table.

How can I convert a datetime object to milliseconds since epoch (unix time) in Python?

It appears to me that the simplest way to do this is:

import datetime

epoch = datetime.datetime.utcfromtimestamp(0)

def unix_time_millis(dt):
    # dt is assumed to be a naive datetime expressed in UTC
    return (dt - epoch).total_seconds() * 1000.0

