Redpill me on the Y2K scare. Why wasn't it as bad as everyone thought? Were there actual problems? Wouldn't servers do most of their comparisons with Unix time, thereby bypassing the problem? Does anyone actually do time comparisons using a YY:mm:dd format? I would assume that most data structures implementing this human-readable form would be backed by Unix time.
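For reference, the failure mode being asked about: when dates are stored as 2-digit-year strings and compared lexicographically, "00" sorts before "99", so 2000 comes out "earlier" than 1999. A minimal C sketch (the strings here are just made-up examples, not anything from a real system):

#include <stdio.h>
#include <string.h>

int main(void) {
    /* Dates stored as "YY-mm-dd" strings and compared as text. */
    const char *last_99  = "99-12-31";
    const char *first_00 = "00-01-01";

    /* strcmp says "00-01-01" < "99-12-31", i.e. 2000 sorts before 1999. */
    if (strcmp(first_00, last_99) < 0)
        printf("%s sorts before %s -- the Y2K bug in one comparison\n",
               first_00, last_99);
    return 0;
}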
Y2k
Most bank software was made in the 80s so it was super legacy. Y2k was a meme at best
The RTC in CMOS is a bit fucky: the year register only covers 0-99, and the century register doesn't have well-specified behavior, having only been introduced in the late 90s.
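Rough sketch of what that looks like from software, assuming x86 Linux with glibc's <sys/io.h> (needs root for ioperm; cmos_read is a made-up helper, register 0x09 is the conventional year slot, and the 0x32 century location comes from ACPI and isn't guaranteed everywhere):

#include <stdio.h>
#include <sys/io.h>

/* Read one CMOS register via the legacy index/data port pair. */
static unsigned char cmos_read(unsigned char reg) {
    outb(reg, 0x70);        /* select the register */
    return inb(0x71);       /* read its value */
}

int main(void) {
    if (ioperm(0x70, 2, 1) != 0) { perror("ioperm"); return 1; }

    unsigned char year    = cmos_read(0x09);  /* 2-digit year, usually BCD, 00-99 only */
    unsigned char century = cmos_read(0x32);  /* "century" byte -- maybe valid, maybe garbage */

    printf("RTC year register    : 0x%02x\n", year);
    printf("RTC century register : 0x%02x\n", century);
    return 0;
}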
maybe it was significant, dunno
>Why wasn't it as bad as everyone thought?
Because people spent more than a decade fixing up the legacy shit from the 1960s to 1980s
>unix time overflows in 2037
No, in 2038.
Could you have addressed the issue for legacy machines that do specific things without the 'net by setting them up with a modem to dial out and sync with ACTS (NIST's dial-up time service) every day at X time? nist.gov
this. also y2k was the first dev bubble where everyone and their dog learned programming because there was a lot of money to be made. shortly afterwards the dotcom boom happened
2137 :D
"unix time" is the number of seconds since midnight, 1970-01-01. How that integer is stored in the backend, is implementation specific (though, yes, it's a little constrained at the library/syscall level thanks to POSIX). Anyways, year 2038 is only a problem if the time is stored as a 32-bit integer. The major 'nix's have already been patched for this.
why didn't they simply go with an unsigned integer?
>Y2K was a meme
This, faggot
One of my tutors at college made a fucking fortune fixing COBOL payroll software for various businesses in 1998
I'd guess they chose signed integers at the API level to allow for dates prior to 1970 to be expressed. This might be useful for things such as file modification times, or just simple arithmetic. The kernel itself most likely stores time as an unsigned integer tho
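Quick illustration of what a signed time_t buys you (assuming your libc's gmtime accepts pre-epoch values, which glibc does):

#include <stdio.h>
#include <time.h>

int main(void) {
    /* Negative values count backwards from the epoch. */
    time_t before_epoch = -86400;   /* one day before 1970-01-01 00:00:00 UTC */
    char buf[32];

    strftime(buf, sizeof buf, "%Y-%m-%d %H:%M:%S", gmtime(&before_epoch));
    printf("%lld -> %s UTC\n", (long long)before_epoch, buf);  /* 1969-12-31 00:00:00 */
    return 0;
}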
Depends on the thingy in question. Most embedded stuff didn't give a fuck about the year. But most financial stuff only used 2-digit years, and was written in COBOL or another legacy language.
Usually they """fixed""" that by shifting the wraparound for the two-digit year, e.g. treating 20-99 as 1920-1999 and 00-19 as 2000-2019, so the window covers 1920-2019. Oh, wait...
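That windowing "fix" is basically this (a sketch with a made-up expand_year() helper, not any particular vendor's code):

#include <stdio.h>

/* Expand a 2-digit year with a fixed pivot: 20-99 -> 1920-1999, 00-19 -> 2000-2019. */
static int expand_year(int yy) {
    return (yy >= 20) ? 1900 + yy : 2000 + yy;
}

int main(void) {
    printf("99 -> %d\n", expand_year(99));  /* 1999 */
    printf("05 -> %d\n", expand_year(5));   /* 2005 */
    printf("20 -> %d\n", expand_year(20));  /* 1920 -- anything entered in 2020+ is wrong again */
    return 0;
}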
>Why wasn't it as bad as everyone thought?
Because most things didn't have a problem to begin with, and of the things that did, people fixed their shit in time. I was working on an embedded thingy in assembly language in 1998 when I noticed it didn't do the 400-year leap year rule for 2000. So I fixed it.
Fortunately someone else tested it in 1999, because I got the branch backwards, and all leap years were wrong. Young me was a bit sloppy about testing.
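For reference, the full Gregorian rule that firmware kept getting wrong (just the textbook version with a made-up is_leap() helper, not the actual assembly in question):

#include <stdbool.h>
#include <stdio.h>

/* Divisible by 4, except century years, except every 400 years -- so 2000 IS a leap year. */
static bool is_leap(int year) {
    if (year % 400 == 0) return true;
    if (year % 100 == 0) return false;
    return year % 4 == 0;
}

int main(void) {
    const int years[] = {1900, 1996, 2000, 2100};
    for (int i = 0; i < 4; i++)
        printf("%d: %s\n", years[i], is_leap(years[i]) ? "leap" : "not leap");
    return 0;
}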
And there were even freaking hardware calendar problems, though they didn't necessarily show in 2000. The PS3 clock chip did leap years by checking if the _BCD_ year was divisible by four. In BCD, year 10 is stored as 0x10 = 16, so of course 10 is divisible by four! I don't think so, Tim. That's why I prefer clock chips which are a simple 32-bit seconds counter. It's harder to patch a hardware bug.
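That bug boils down to doing the modulo on the raw BCD byte instead of the decoded year. A sketch of the failure mode (made-up helper names, not the actual chip logic):

#include <stdio.h>

/* Buggy: treats the BCD byte as a plain binary number. 0x10 == 16, and 16 % 4 == 0. */
static int leap_buggy(unsigned char bcd_year) {
    return (bcd_year % 4) == 0;
}

/* Decode the BCD digits first, then test (century rules ignored, like the chip). */
static int leap_decoded(unsigned char bcd_year) {
    int year = (bcd_year >> 4) * 10 + (bcd_year & 0x0F);
    return (year % 4) == 0;
}

int main(void) {
    unsigned char y2010 = 0x10;                      /* "10" in BCD */
    printf("buggy  : %d\n", leap_buggy(y2010));      /* 1 -- thinks 2010 is a leap year */
    printf("decoded: %d\n", leap_decoded(y2010));    /* 0 */
    return 0;
}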
Typical women logic
>tens of thousands of engineers and coders work day in and out to avoid a serious problem
>See everything turned out fine why did anyone worry? xD
Because you might still need to reference dates before 1970.
The two main ways were to add another 32 bits to the left (wasteful, way more bits than anyone needs), or the NeXT way: redefine it as a double float, so you can do milliseconds or microseconds for delta times and still have sub-second accuracy way into the future.
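Roughly what the double-float flavor looks like next to the plain 64-bit count (a sketch using POSIX clock_gettime and a made-up now_seconds() helper, not NeXT's actual API):

#include <stdio.h>
#include <time.h>

/* Seconds since the epoch as a double, in the spirit of NeXT's NSTimeInterval. */
static double now_seconds(void) {
    struct timespec ts;
    clock_gettime(CLOCK_REALTIME, &ts);
    return (double)ts.tv_sec + ts.tv_nsec / 1e9;
}

int main(void) {
    double t0 = now_seconds();
    /* ... do some work ... */
    double t1 = now_seconds();
    printf("delta = %.6f s\n", t1 - t0);        /* sub-second deltas for free */

    long long t64 = (long long)t0;              /* the "just make time_t 64-bit" path */
    printf("64-bit seconds: %lld\n", t64);
    return 0;
}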
>only 2 abortions were carried out
WHY
>Y2k was a meme at best
For consumer software, that's /mostly/ true, but it was absolutely not the case for the millions of lines of mainframe and minicomputer code (some places still used VAX in the late 90s) in production use by banks, businesses, research, and governments. Literally billions of dollars were spent collectively to correct code; it was an enormous effort, but it was largely behind the scenes.
but it won't have a catchy cyberpunk name, so normies won't give a shit
...because negative timestamps are valid (times before the 1970 epoch)
based
The biggest Y2K problem was 2-digit years in databases and in the code that processed them. Rolling the date window forward only works if there isn't already existing data from before the start of the new window, especially if you're too pussy to update your code for 4-digit years at the same time.
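Concretely, the same stored two-digit value decodes differently once the window moves, so old records silently jump forward a century (a sketch with made-up pivots and a made-up expand() helper):

#include <stdio.h>

/* Expand a 2-digit year into the 100-year window starting at `pivot`. */
static int expand(int yy, int pivot) {
    int year = 1900 + yy;
    return (year < pivot) ? year + 100 : year;
}

int main(void) {
    int stored = 25;   /* a birth year recorded back in 1925 */
    printf("old window (1920-2019): %d\n", expand(stored, 1920));  /* 1925 */
    printf("new window (1950-2049): %d\n", expand(stored, 1950));  /* 2025 */
    return 0;
}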