Why are dates calculated from January 1st, 1970?

Question:

Is there any reason behind using date(January 1st, 1970) as the default standard for time manipulation? I have seen this standard in Java as well as in Python, the two languages I am aware of. Are there other popular languages which follow the same standard?

Please describe.

Answers:

It is the standard of Unix time.

Unix time, or POSIX time, is a system for describing points in time, defined as the number of seconds elapsed since midnight proleptic Coordinated Universal Time (UTC) of January 1, 1970, not counting leap seconds.
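
As a rough illustration in Python (one of the languages named in the question), that count can be read directly; the value shown in the comment is of course just an example:

    import time

    # Seconds elapsed since 1970-01-01T00:00:00Z, with a fractional part
    # for sub-second precision.
    now = time.time()
    print(now)  # e.g. 1413304361.018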

Answered By: Bobby

Yes, C (and its family). This is where Java took it from, too.

Answered By: Péter Török

Midnight (00:00:00) UTC on January 1st, 1970 is the zero point of POSIX time.

Answered By: Benjamin Bannier

Is there any reason behind using date(January 1st, 1970) as standard for time manipulation?

No reason that matters.

Python’s time module is a thin wrapper around the C library. Ask Ken Thompson why he chose that date for an epochal date. Maybe it was someone’s birthday.
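
On a POSIX system, the epoch that the C library hands to Python can be inspected directly, for instance:

    import time

    # On POSIX systems, timestamp 0 corresponds to 1970-01-01 00:00:00 UTC.
    print(time.gmtime(0))
    # time.struct_time(tm_year=1970, tm_mon=1, tm_mday=1, tm_hour=0, ...)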

Excel uses two different epochs. Any reason why different versions of Excel use different dates?

Other than the actual programmer, no one will ever know why those kinds of decisions were made.

And…

It does not matter why the date was chosen. It just was.

Astronomers use their own epochal date: http://en.wikipedia.org/wiki/Epoch_(astronomy)

Why? A date has to be chosen to make the math work out. Any random date will work.

A date far in the past avoids negative numbers for the general case.

Some of the smarter packages use the proleptic Gregorian year 1. Any reason why year 1?
There’s a reason given in books like Calendrical Calculations: it’s mathematically slightly simpler.

But if you think about it, the difference between 1/1/1 and 1/1/1970 is just 1969 years, a trivial mathematical offset.
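
As it happens, Python’s date.toordinal() uses exactly that proleptic Gregorian year-1 epoch (counting days, with January 1 of year 1 as day 1), so a small sketch can show that converting between the two epochs is nothing more than a constant offset:

    from datetime import date

    # Day number of the Unix epoch date in the proleptic Gregorian
    # year-1 scheme used by date.toordinal().
    unix_epoch_ordinal = date(1970, 1, 1).toordinal()  # 719163

    some_day = date(2014, 10, 14)
    days_since_year_1 = some_day.toordinal()
    days_since_1970 = days_since_year_1 - unix_epoch_ordinal
    print(days_since_1970)  # whole days elapsed since 1970-01-01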

Answered By: S.Lott

Q) “Why are dates calculated from January 1st, 1970?”

A) It had to be as recent as possible, yet still include some of the past.
There was most likely no other significant reason; a lot of people feel the same way.

They knew it would pose a problem if they placed it too far in the past, and that it would give negative results if they placed it in the future. There was no need to go deeper into the past, since most events to be recorded would take place in the future.

Notes:
The Mayans, on the other hand, needed to place events in the past, since they had knowledge of a great deal of history, for which they made a long-term calendar, just to place all the routine phenomena on it.

The timestamp was not meant as a calendar; it’s an epoch. And I believe the Mayans made their long-term calendar from that same perspective. (Meaning they knew damn well they didn’t have any relation to that distant past, they just had the need to see it on a bigger scale.)

Answered By: Yezpahr

Why is it always 1st January 1970? Because ‘1st January 1970’, usually called the “epoch date”, is the date when time started for Unix computers, and that timestamp is marked as ‘0’. Any time since that date is calculated based on the number of seconds elapsed. In simpler words, the timestamp of any date is the difference in seconds between that date and ‘1st January 1970’. The timestamp is just an integer which started from ‘0’ at midnight on 1st January 1970 and increments by ‘1’ as each second passes. For converting UNIX timestamps to readable dates, PHP and other open-source languages provide built-in functions.
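
The answer mentions PHP’s built-in functions; a similar conversion in Python (used only as an illustration here) looks like this:

    from datetime import datetime, timezone

    ts = 1413304361  # seconds since 1970-01-01T00:00:00Z
    readable = datetime.fromtimestamp(ts, tz=timezone.utc)
    print(readable.isoformat())  # 2014-10-14T16:32:41+00:00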

Answered By: Friyank

using date(January 1st, 1970) as default standard

The Question makes two false assumptions:

  • All time-tracking in computing is done as a count-since-1970.
  • Such tracking is standard.

Two Dozen Epochs

Time in computing is not always tracked from the beginning of 1970 UTC. While that epoch reference is popular, various computing environments over the decades have used nearly two dozen different epochs. Some are from other centuries. They range from year 0 (zero) to 2001.

Here are a few.

  • January 0, 1 BC
  • January 1, AD 1
  • October 15, 1582
  • January 1, 1601
  • December 31, 1840
  • November 17, 1858
  • December 30, 1899
  • December 31, 1899
  • January 1, 1900
  • January 1, 1904
  • December 31, 1967
  • January 1, 1980
  • January 6, 1980
  • January 1, 2000
  • January 1, 2001

Unix Epoch Common, But Not Dominant

The beginning of 1970 is popular, probably because of its use by Unix. But by no means is that dominant. For example:

  • Countless millions (billions?) of Microsoft Excel & Lotus 1-2-3 documents use January 0, 1900 (December 31, 1899).
  • The world now has over a billion iOS/OS X devices using the Cocoa (NSDate) epoch of 1 January 2001, GMT.
  • The GPS satellite navigation system uses January 6, 1980 while the European alternative Galileo uses 22 August 1999.
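
Since every one of these schemes counts from some fixed instant, translating between any two of them is just a fixed offset. A minimal sketch using the Cocoa epoch from the list above (the sample value is hypothetical):

    from datetime import datetime, timezone

    # Seconds between the Unix epoch (1970-01-01T00:00:00Z) and the
    # Cocoa/NSDate epoch (2001-01-01T00:00:00Z).
    COCOA_EPOCH_OFFSET = 978_307_200

    cocoa_seconds = 435_983_561  # hypothetical NSDate-style value
    unix_seconds = cocoa_seconds + COCOA_EPOCH_OFFSET
    print(datetime.fromtimestamp(unix_seconds, tz=timezone.utc))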

ISO 8601

Assuming that a count-since-epoch uses the Unix epoch opens up a big vulnerability to bugs. Such a count is impossible for a human to decipher instantly, so errors and issues won’t be easily flagged when debugging and logging. Another problem is the ambiguity of granularity, explained below.

For data interchange, I strongly suggest serializing date-time values as unambiguous ISO 8601 strings rather than as an integer count-since-epoch: YYYY-MM-DDTHH:MM:SS.SSSZ, such as 2014-10-14T16:32:41.018Z.
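
For example, in Python a UTC date-time can be serialized to, and parsed back from, such a string (note that Python emits the offset as +00:00 rather than Z):

    from datetime import datetime, timezone

    moment = datetime(2014, 10, 14, 16, 32, 41, 18000, tzinfo=timezone.utc)
    text = moment.isoformat(timespec="milliseconds")
    print(text)  # 2014-10-14T16:32:41.018+00:00
    parsed = datetime.fromisoformat(text)  # round-trips without ambiguity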

Count Of What Since Epoch

Another issue with count-since-epoch time tracking is the time unit, with at least four levels of resolution commonly used.

[Diagram: different software counting from the epoch by seconds, milliseconds, microseconds, or nanoseconds.]
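
For instance, Python alone exposes the same instant at more than one granularity, and a bare integer does not say which unit (or which epoch) it uses:

    import time

    seconds     = int(time.time())   # e.g. 1413304361
    nanoseconds = time.time_ns()     # e.g. 1413304361018000000
    # Dividing gives milliseconds or microseconds; nothing in the number
    # itself reveals the unit, so it must be documented out-of-band.
    milliseconds = time.time_ns() // 1_000_000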

Answered By: Basil Bourque