Unix Timestamp Always in GMT

UNIX timestamp always in GMT?

Yep, a UNIX timestamp represents how many seconds have passed since the Unix epoch, which is measured in GMT+0 (UTC).

Do UNIX timestamps change across timezones?

The definition of a UNIX timestamp is time zone independent. The UNIX timestamp is the number of seconds (or milliseconds) elapsed since an absolute point in time: midnight of Jan 1 1970 in UTC. (UTC is essentially Greenwich Mean Time without Daylight Saving Time adjustments.)
Regardless of your time zone, the UNIX timestamp represents a moment that is the same everywhere. Of course you can convert back and forth to a local time zone representation (time 1397484936 is such-and-such local time in New York, or some other local time in Djakarta) if you want.

The article at http://en.wikipedia.org/wiki/Unix_time is pretty impressive if you'd like a longer read.

Does PHP time() return a GMT/UTC Timestamp?

time() returns a UNIX timestamp, which is timezone independent. Since a UNIX timestamp denotes the number of seconds elapsed since 1970 UTC, you could say it's UTC, but it really has no timezone.


To be really clear, a UNIX timestamp is the same value all over the world at any given time. At the time of writing it's 1296096875 in Tokyo, London and New York. To convert this into a "human readable" time, you need to specify which timezone you want to display it in. 1296096875 in Tokyo is 2011-01-27 11:54:35, in London it's 2011-01-27 02:54:35 and in New York it's 2011-01-26 21:54:35.
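
To see this concretely, here is a minimal sketch in Python (3.9+ for zoneinfo; the same idea applies in any language) rendering that one timestamp in those three timezones:

    from datetime import datetime
    from zoneinfo import ZoneInfo  # Python 3.9+

    ts = 1296096875  # one absolute instant, the same everywhere

    for zone in ("Asia/Tokyo", "Europe/London", "America/New_York"):
        local = datetime.fromtimestamp(ts, tz=ZoneInfo(zone))
        print(zone, local.strftime("%Y-%m-%d %H:%M:%S"))
    # Asia/Tokyo 2011-01-27 11:54:35
    # Europe/London 2011-01-27 02:54:35
    # America/New_York 2011-01-26 21:54:35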

In effect you're usually dealing with (a mix of) these concepts when handling times:

  • absolute points in time, which I like to refer to as points in human history
  • local time, which I like to refer to as wall clock time
  • complete timestamps in any format which express an absolute point in human history
  • incomplete local wall clock time

Visualise time like this:

-------+-------------------+-------+--------+----------------+------>
       |                   |       |        |                |
Dinosaurs died        Jesus born  Y2K Mars colonised        ???

(not to scale)

An absolute point on this line can be expressed as:

  • 1296096875
  • Jan. 27 2011 02:54:35 Europe/London

Both formats express the same absolute point in time in different notations. The former is a simple counter which started roughly here:

                         start of UNIX epoch
                                  |
-------+-------------------+------++--------+----------------+------>
       |                   |       |        |                |
Dinosaurs died        Jesus born  Y2K Mars colonised        ???

The latter is a much more complicated but equally valid and expressive counter which started roughly here:

              start of Gregorian calendar
                           |
-------+-------------------+-------+--------+----------------+------>
       |                   |       |        |                |
Dinosaurs died        Jesus born  Y2K Mars colonised        ???

UNIX timestamps are simple. They're a counter which started at one specific point in time and which keeps increasing by 1 every second (for the official definition of what a second is). Imagine someone in London started a stopwatch at midnight Jan 1st 1970, which is still running. That's more or less what a UNIX timestamp is. Everybody uses the same value of that one stopwatch.

Human readable wall clock time is more complicated, and it's even more complicated by the fact that it's abbreviated and parts of it omitted in daily use. 02:54:35 means almost nothing on the timeline pictured above. Jan. 27 2011 02:54:35 is already a lot more specific, but could still mean a variety of different points on this line. "When the clock struck 02:54:35 on Jan. 27 2011 in London, Europe" is now finally an unambiguous absolute point on this line, because there's only one point in time at which this was true.

So, timezones are a "modifier" of "wall clock times" which are necessary to express a unique, absolute point in time using a calendar and hour/minute/second notation. Without a timezone a timestamp in such a format is ambiguous, because the clock struck 02:54:35 on Jan. 27 2011 in every country around the globe at different times.

A UNIX timestamp inherently does not have this problem.


To convert from a UNIX timestamp to a human readable wall clock time, you need to specify which timezone you'd like the time displayed in. To convert from wall clock time to a UNIX timestamp, you need to know which timezone that wall clock time is supposed to be in. You either have to include the timezone every single time with each such conversion, or you set the default timezone to be used with date_default_timezone_set.
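
As an illustration of both directions, here is a sketch in Python, with the timezone passed explicitly on every conversion (date_default_timezone_set is the PHP way of setting that default once; the zone names are just examples):

    from datetime import datetime
    from zoneinfo import ZoneInfo

    # Wall clock time -> UNIX timestamp: the source timezone must be known.
    wall = datetime(2011, 1, 27, 2, 54, 35, tzinfo=ZoneInfo("Europe/London"))
    print(int(wall.timestamp()))  # 1296096875

    # UNIX timestamp -> wall clock time: a display timezone must be chosen.
    print(datetime.fromtimestamp(1296096875, tz=ZoneInfo("Asia/Tokyo")))
    # 2011-01-27 11:54:35+09:00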

Why is the Unix timestamp for the same time different in different timezones?

For example, if I (GMT+5:30) and my friend (GMT+5:00) each start counting ticks from 00:00 hrs in our own local time, then at 18:00 hrs local time the number of ticks should be the same in both timezones.

No, because both of you start counting from 00:00 UTC. That's the definition. Counting from your own local midnight really means counting from 18:30 UTC of the previous day for you, and from 19:00 UTC for your friend - two different instants, hence two different counts.

The idea is that a single instant in time has the same timestamp value everywhere. So if I were calling you now (and ignoring phone delays) we could both agree that "now" is a Unix timestamp of 1374130418. You may have a different local time to me, but we can express "now" in a common format.
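
A small sketch in Python makes the point; the date is arbitrary, only the two fixed offsets from the question matter. Local midnight at GMT+5:30 and at GMT+5:00 are two different instants, so they get two different timestamps:

    from datetime import datetime, timezone, timedelta

    midnight = datetime(2013, 7, 18, 0, 0)  # arbitrary date, read as local midnight

    at_530 = midnight.replace(tzinfo=timezone(timedelta(hours=5, minutes=30)))
    at_500 = midnight.replace(tzinfo=timezone(timedelta(hours=5)))

    print(int(at_530.timestamp()))  # 1374085800
    print(int(at_500.timestamp()))  # 1374087600, i.e. 1800 seconds later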

See the "core concepts" part of the Noda Time user guide for more discussion of local time vs "global" time.

Need to receive local UNIX Timestamp using Moment.js rather than UTC Timestamp

There's no such thing as a "local Unix timestamp". A Unix timestamp measures the number of seconds [1] since the Unix epoch, which is defined as midnight on January 1st 1970 UTC.

This means that at any given instant in time, the current Unix timestamp is the same everywhere in the world. If people from two different countries were having a conversation, they'd both agree on "the current Unix timestamp" even if their watches showed different times.

So result b in your question is already correct: when it was 11:45am on April 11th 2020 in India, the Unix timestamp was 1586585700. Anything expecting 1586605500 isn't expecting a Unix timestamp. Rather than try to provide the result that that code expects, I would try to correct that code to expect a Unix timestamp - or get it to expect a local value in some other form.
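
To illustrate with the numbers from the question, here is a sketch in Python (the same arithmetic applies whatever library produced the values):

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    wall = datetime(2020, 4, 11, 11, 45)  # the naive wall clock reading

    # Correct: interpret the wall time in its real zone, then take the timestamp.
    print(int(wall.replace(tzinfo=ZoneInfo("Asia/Kolkata")).timestamp()))
    # 1586585700

    # The "local Unix timestamp" that broken code expects is really the wall
    # time mislabelled as UTC; it differs by the local offset (+5:30 here).
    print(int(wall.replace(tzinfo=timezone.utc).timestamp()))
    # 1586605500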


[1] At least traditionally. The idea of "a Unix timestamp but in milliseconds, or microseconds, or nanoseconds" makes perfect sense, and doesn't change any of the rest of this answer.

Why do a timezone-unaware and a timezone-aware datetime object with a different timezone yield the same unix-timestamp?

UNIX timestamps have no timezone, they always express the number of seconds elapsed since Jan. 1st 1970 00:00 UTC. That number is the same globally, it doesn't change with your timezone.

Naive datetime instances are assumed to represent local time and [timestamp] relies on the platform C mktime() function to perform the conversion.

https://docs.python.org/3/library/datetime.html#datetime.datetime.timestamp

So, if you are in Europe/Berlin, then a naïve datetime and a datetime localised to Europe/Berlin are interpreted the same way when converted to a timestamp. Try localising to other timezones, so that datetime(2020, 4, 1, 0, 0, 0) actually refers to a different instant, and you'll see different timestamps as well.
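
For instance, a short sketch (assuming the machine's local timezone is Europe/Berlin, as in the question):

    from datetime import datetime
    from zoneinfo import ZoneInfo

    naive  = datetime(2020, 4, 1, 0, 0, 0)  # interpreted as local time
    berlin = naive.replace(tzinfo=ZoneInfo("Europe/Berlin"))
    tokyo  = naive.replace(tzinfo=ZoneInfo("Asia/Tokyo"))

    # On a Europe/Berlin machine the first two agree; Tokyo differs,
    # because midnight in Tokyo is a different instant.
    print(naive.timestamp())   # 1585692000.0 (on a Europe/Berlin machine)
    print(berlin.timestamp())  # 1585692000.0
    print(tokyo.timestamp())   # 1585666800.0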

Are unix timestamps the best way to store timestamps?

However you choose to store a timestamp, it is important to avoid regional interpretation problems and time offset problems. A Unix timestamp is interpreted the same regardless of region, and is calculated from the same point in time regardless of time zone - these are good things.

Beware storing timestamps as ambiguous strings such as 01/02/2008, since that can be interpreted as January 02, 2008 or February 01, 2008, depending on locale.
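
For instance, in Python the same string parses to two different dates depending on which format you assume:

    from datetime import datetime

    s = "01/02/2008"
    print(datetime.strptime(s, "%m/%d/%Y").date())  # 2008-01-02 (US reading)
    print(datetime.strptime(s, "%d/%m/%Y").date())  # 2008-02-01 (European reading)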

When storing hours/minutes/seconds, it is important to know "which" hour/minute/second is being specified. You can do this by including timezone information (not needed for a Unix timestamp, since it is defined relative to UTC).

However, note that Unix timestamps cannot uniquely represent some instants in time: when there is a leap second in UTC, the Unix timestamp does not change, so both 23:59:60 UTC and 00:00:00 the next day have the same Unix representation. So if you really need one second or better resolution, consider another format.
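
A small Python illustration of that ambiguity (the leap second at the end of 2016-12-31 is a real example):

    from datetime import datetime, timezone

    # The instant just after the 2016-12-31 leap second:
    print(datetime(2017, 1, 1, tzinfo=timezone.utc).timestamp())  # 1483228800.0

    # The leap second 23:59:60 UTC itself maps to the same number, and the
    # datetime type cannot even represent second 60:
    try:
        datetime(2016, 12, 31, 23, 59, 60)
    except ValueError as e:
        print(e)  # second must be in 0..59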

If you prefer a more human readable format for storage than a Unix timestamp, consider ISO 8601.

One technique that helps keep things straight-forward is to store dates as UTC and only apply timezone or DST offsets when displaying a date to a user.
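
A minimal sketch of that pattern in Python, storing an ISO 8601 string in UTC and applying a timezone only at display time (the display zone here is just an example):

    from datetime import datetime, timezone
    from zoneinfo import ZoneInfo

    # Store as UTC, serialised in ISO 8601 ...
    stored = datetime.now(timezone.utc).isoformat()  # e.g. '2020-04-11T06:15:00+00:00'

    # ... and apply a timezone only when showing the value to a user.
    shown = datetime.fromisoformat(stored).astimezone(ZoneInfo("America/New_York"))
    print(stored, "->", shown.strftime("%Y-%m-%d %H:%M:%S %Z"))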


