Epoch time was chosen by programmers as a matter of convenience.
Think of it as an anchor point from which we can measure time. Just like counting up from 0, so you always know what “100” means. In order for epoch time to work, programmers had to agree on a common “zero date”.
1970-01-01 was chosen, iirc, in the early 1970s, shortly after that date, because a recent round date was close enough to work, and epoch timestamps (seconds since that moment, or milliseconds in some systems) are extremely useful for recording dates. This way you can boil a date down to a simple number and always know exactly what moment it refers to.
It allows you to ignore things like am/pm, time zones, and the like.
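For example, here’s a minimal sketch in Python of the idea (the timestamp value is just an arbitrary example):

```python
from datetime import datetime, timezone

# A Unix timestamp is just seconds elapsed since 1970-01-01 00:00:00 UTC.
ts = 1_000_000_000  # an arbitrary example value

# No am/pm or time zone logic is needed until the very last step,
# when you pick a zone to display the moment in.
print(datetime.fromtimestamp(ts, tz=timezone.utc))
# 2001-09-09 01:46:40+00:00
```

The same number means the same instant everywhere; only the display step depends on where you are.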
I don’t think there was much malice or subtext in the epoch that was chosen. It was intentionally picked as a reasonable “zero”: the zeroth second, minute, and hour of the first day of a round-number year, in the “zero” time zone (Greenwich Mean Time).
Interesting, thanks. So it’s a programmer’s/CPU’s version of BC/AD, kind of?
Yeah, it's kinda like that. Fun fact: the maximum date you can represent with a signed 32-bit count of seconds is 19 January 2038 (at 03:14:07 UTC). So we're going to have a mini Y2K where some software will need to be rewritten in order to keep tracking time. Luckily 64-bit time is widely adopted, so most computers will be able to keep counting for millennia.
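You can check that limit yourself; a quick sketch, assuming a signed 32-bit counter of seconds:

```python
from datetime import datetime, timezone

# The largest value a signed 32-bit integer can hold.
max_int32 = 2**31 - 1  # 2,147,483,647 seconds

# One second past this, a 32-bit time_t overflows ("Y2038").
print(datetime.fromtimestamp(max_int32, tz=timezone.utc))
# 2038-01-19 03:14:07+00:00
```

With a 64-bit counter, the same arithmetic gives a rollover date hundreds of billions of years out, which is why it's not a practical concern.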