What are the consequences? How many systems are 32-bit? What can be done to fix it?
Y2K was supposed to kill us all, but the only terrible thing that happened was all of us having a hangover the next morning.
Unix time is the number of seconds since midnight on Jan 1, 1970 (UTC). For example, a timestamp of 31,536,000 works out to midnight on Jan 1, 1971, since that's exactly one non-leap year of seconds (365 days × 86,400 seconds). The programmer (usually this is handled at the API level) takes the number of seconds, adds it to Jan 1, 1970, and that's the time. Traditionally this value is stored in a signed 32-bit integer, which can only hold values between roughly -2.147 billion and +2.147 billion. Once it sits at the max and ticks one second higher, it rolls over and suddenly becomes negative 2.147 billion. The time calculation then becomes Jan 1, 1970 minus 2.147 billion seconds, which lands you back in December 1901. That rollover happens in January 2038, and it is obviously very, very bad.
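To make the rollover concrete, here's a minimal C sketch (my own illustration, not code from any particular system) that stores the timestamp in a plain signed 32-bit integer the way older systems do:

```c
#include <stdint.h>
#include <stdio.h>
#include <time.h>

int main(void) {
    /* Largest value a signed 32-bit counter can hold: 2,147,483,647 seconds
       after midnight Jan 1, 1970 UTC, which lands on Jan 19, 2038. */
    int32_t max32 = INT32_MAX;

    time_t as_time = (time_t)max32;
    printf("Last representable moment: %s", ctime(&as_time));

    /* One second later the 32-bit value wraps to -2,147,483,648, which the
       same date math interprets as a moment in December 1901. */
    int32_t wrapped = (int32_t)((uint32_t)max32 + 1u);
    as_time = (time_t)wrapped;
    printf("One second later (wrapped): %s", ctime(&as_time));
    return 0;
}
```

On a machine whose own time_t is 64-bit, the first line prints a date in January 2038 and the second prints one in December 1901; that jump backwards is the wraparound in action.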
The fix is conceptually easy but requires revamping pretty much everything. You convert the timestamp to a 64-bit (or even 128-bit) integer, which will last until the sun burns out and a lot longer. However, because the data size is different, this means touching basically the entire OS: the timestamp is now bigger, and there are lots of data structures and memory allocations that assume it is only 32 bits wide.
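If you're curious whether a given system has already made the jump, the standard C way is just to look at the size of time_t. A quick sketch, assuming a hosted C environment:

```c
#include <stdio.h>
#include <time.h>

int main(void) {
    printf("time_t here is %zu bytes (%zu bits)\n",
           sizeof(time_t), sizeof(time_t) * 8);

    if (sizeof(time_t) >= 8) {
        /* One second past the 32-bit limit: trivially representable once
           the counter is 64 bits wide (good for roughly 292 billion years). */
        time_t past_2038 = (time_t)2147483648LL;
        printf("One second past the 32-bit limit: %s", ctime(&past_2038));
    } else {
        printf("This system still stores time_t in 32 bits.\n");
    }
    return 0;
}
```

Most modern 64-bit operating systems already use a 64-bit time_t; the lingering risk is older 32-bit systems, embedded devices, and any code or data format that baked the 32-bit size in.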
It was similar with Y2K, except there the problem was identified and fixed in time. Thankfully, testing for it is pretty easy at the individual-software level; testing across an entire enterprise is not, since it's easy to miss modules. The downside is that this is a bigger problem, because far more systems rely on Unix time now than relied on two-digit years back when Y2K was the issue. Y2K could've been a huge mess too if programmers hadn't fixed it.
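As a rough illustration of what an individual-software-level check might look like (the function names here are hypothetical stand-ins, not real APIs), you feed timestamps near the 2038 boundary into your date math and assert that time still moves forward:

```c
#include <assert.h>
#include <stdint.h>

/* Hypothetical stand-in for a patched module that does its date math
   on a 64-bit count of seconds. */
static int64_t add_one_day_64(int64_t ts) { return ts + 86400; }

/* Hypothetical stand-in for the unpatched version, which stores the
   result back into 32 bits and wraps negative near the limit. */
static int32_t add_one_day_32(int32_t ts) { return (int32_t)(ts + 86400LL); }

int main(void) {
    int64_t near_limit = 2147483647LL - 3600; /* one hour before rollover */

    /* Patched path: time keeps moving forward across the 2038 boundary. */
    assert(add_one_day_64(near_limit) > near_limit);

    /* Unpatched path: the result wraps negative, so this check would fail:
       assert(add_one_day_32((int32_t)near_limit) > near_limit); */
    return 0;
}
```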
The good thing is that people expect it, we have about 15 years to deal with it, and systems have already begun patching for it. For example, if a bank applies a patch and suddenly a customer shows up as not even born yet, the bank knows about the patch and can recognize that the bogus date is likely due to the patch being hosed.