Are we completely fucked by the Year 2038 problem?

Probably not, makes for neat stickers for computer autists to put on their aging pre-intel ME/AMD PSP anti-glownigger spyware machines though
 
Hopefully I will be dead before I need to upgrade my computer again so not my problem
 
The Year 2038 problem will either hinder or destroy the plans for what 2050 will have to offer regarding technology, infrastructure, politics, and education. Even the A.I. of that era will not be advanced enough to stop the inevitable issues that 2038 will bring, let alone what 2050 has to offer.
 
On January 19th, 2038, the Unix timestamp will be too big to store in a signed 32-bit integer.
What are the consequences? How many systems are 32-bit? What can be done to fix it?
Y2K was supposed to kill us all but the only terrible thing that happened was us all having a hangover in the morning.
 
By the time 2038 happens, I most likely won't be using a computer or the internet, because it will be way trooned out by that point. I'd most likely be a farmer in Mexico trying to fend off the cartel. But if it does happen, the Sneed would be something to look forward to.
 
What are the consequences? How many systems are 32-bit? What can be done to fix it?
Y2K was supposed to kill us all but the only terrible thing that happened was us all having a hangover in the morning.
As with last time, a currently unknown subset of all the world's software will malfunction. The solution is to identify and modify the affected systems before that time.

The reason you probably weren't affected by Y2K is because we successfully identified and modified all the important stuff in time (at great expense).
 
What are the consequences? How many systems are 32-bit? What can be done to fix it?
Y2K was supposed to kill us all but the only terrible thing that happened was us all having a hangover in the morning.
Unix time is the number of seconds since Jan 1, 1970 at midnight UTC. For example, if the timestamp said 31,536,000, that would be Jan 1, 1971, at midnight (365 days' worth of seconds exactly, since 1970 wasn't a leap year). The software (usually at the API level) takes that number of seconds, adds it to Jan 1, 1970, and that is the time. Traditionally this is stored in a signed 32-bit integer, which can hold values between about -2.147 billion and +2.147 billion. If it's sitting at the max and ticks one second higher, it rolls over and suddenly becomes negative 2.147 billion. The time calculation then becomes Jan 1, 1970 minus 2.147 billion seconds, which lands on December 13, 1901. This is obviously very, very bad. The maximum gets hit on January 19, 2038 at 03:14:07 UTC.
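Here's a rough sketch in C of what that rollover actually looks like. None of this is from any particular system; the names are made up, and strictly speaking the narrowing cast is implementation-defined in older C standards, but on every common two's-complement machine it wraps exactly like this:

C:
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void)
{
    /* 2,147,483,647 seconds after the epoch = 2038-01-19 03:14:07 UTC,
       the last second a signed 32-bit timestamp can represent. */
    int64_t last_good = INT32_MAX;
    int64_t one_later = last_good + 1;      /* 2,147,483,648 */

    /* Squeeze that back into 32 bits and it wraps to -2,147,483,648. */
    int32_t wrapped = (int32_t)one_later;
    printf("32-bit view of the next second: %d\n", wrapped);

    /* Interpreted as "seconds since 1970", that is December 1901.
       (Assumes your gmtime handles pre-1970 times, which glibc does.) */
    time_t as_time = (time_t)wrapped;
    printf("which the C library renders as: %s", asctime(gmtime(&as_time)));
    return 0;
}

On a 64-bit Linux box that prints -2147483648 followed by Fri Dec 13 20:45:52 1901, which is where the "back to 1901" thing further down the thread comes from.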

The fix is easy in principle but requires pretty much a revamp of everything: convert the timestamp to a 64-bit (or even 128-bit) integer, which will last until long after the sun burns out. However, because the data size is different, this requires touching the entire OS and everything built on it, since the timestamp is now bigger and there are a lot of memory layouts, file formats, and structures that assume the size is only 32 bits.
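And here's the other side of it, a sketch of what a fixed build looks like from userland, assuming a 64-bit time_t: the same second that wraps a 32-bit counter is just another boring timestamp. (For 32-bit glibc builds there's an opt-in via -D_TIME_BITS=64 together with -D_FILE_OFFSET_BITS=64, if I remember the glibc 2.34 changes right.)

C:
#include <stdio.h>
#include <stdint.h>
#include <time.h>

int main(void)
{
    /* How wide is time_t in this build? 64 bits means 2038 is a non-event
       here; 32 bits means this binary is still affected. */
    printf("time_t here is %d bits\n", (int)(sizeof(time_t) * 8));

    /* One second past the old 32-bit limit. */
    int64_t first_bad_second = 2147483648LL;

    if (sizeof(time_t) >= 8) {
        time_t t = (time_t)first_bad_second;
        printf("2,147,483,648 renders as: %s", asctime(gmtime(&t)));
        /* Expected: Tue Jan 19 03:14:08 2038 */
    } else {
        printf("32-bit time_t: that value would wrap, exactly as described above\n");
    }
    return 0;
}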

It was similar with Y2K, except the Y2K problem was identified and fixed in time. Thankfully, testing for it is pretty damn easy at the individual software level; testing at the entire enterprise level is not, since it's easy to miss modules. The downside is that this is a bigger problem, because far more systems use Unix time today than used the two-digit year formats that were the issue with Y2K. Y2K could've been a huge issue if the programmers hadn't fixed it.
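For what "easy to test at the individual software level" could look like in practice, here's a minimal sketch: ask the C library to represent a date comfortably past the rollover and see whether it chokes. The date and messages are just illustrative, and mktime works in local time, so the exact number varies by timezone:

C:
#include <stdio.h>
#include <time.h>

int main(void)
{
    /* Midnight, Jan 1 2040 local time: safely past the 2038 rollover
       in every timezone. */
    struct tm past_rollover = {0};
    past_rollover.tm_year  = 2040 - 1900;   /* tm_year counts from 1900 */
    past_rollover.tm_mon   = 0;             /* January */
    past_rollover.tm_mday  = 1;
    past_rollover.tm_isdst = -1;            /* let the library sort out DST */

    /* mktime returns (time_t)-1 when the date can't be represented,
       which is exactly what a 32-bit time_t build does here. */
    time_t t = mktime(&past_rollover);
    if (t == (time_t)-1) {
        printf("FAIL: this build can't represent dates past the 32-bit limit\n");
        return 1;
    }
    printf("OK: 2040-01-01 stored as %lld seconds since the epoch\n", (long long)t);
    return 0;
}

The enterprise-level version is the same idea aimed at every place a timestamp crosses a boundary (databases, file formats, wire protocols), which is exactly where modules get missed.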

The good thing is that people expect it, we have 15 years to do it, and systems have already begun patching for it. For example, if a bank patches its systems and someone suddenly shows up as not even born yet, the bank will be aware of the patch and realize the glitch is likely due to the patch being hosed.
 
Personally I look forward to computers going back in time to 1901. Some old systems that aren't overhauled by 2038 might have issues though. A few fatal errors may occur.
 