One of the most widespread pieces of technology is floating point arithmetic, stored in formats of various bit widths. The most common are the single (32-bit) and double (64-bit) precision formats; quadruple (128-bit) is rarely used, and virtually no hardware implements octuple (256-bit) at all. These formats underpin how computers handle positions, scores, statistics, mathematics, timekeeping, and so on, but they can only represent a finite range of numbers (the bounds usually tied to powers of 2) before overflowing.
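A quick sketch of what those limits actually look like (Python standard library, assuming 3.9+ for math.ulp): the largest finite double and single values, and how the gap between adjacent representable doubles grows with magnitude.

```python
import math
import struct
import sys

# Largest finite double (64-bit float): about 1.8 * 10^308.
print(sys.float_info.max)  # 1.7976931348623157e+308

# Largest finite single (32-bit float): about 3.4 * 10^38.
# (2 - 2^-23) * 2^127, verified by round-tripping through a 4-byte float.
f32_max = (2 - 2**-23) * 2.0**127
print(struct.unpack('f', struct.pack('f', f32_max))[0])  # 3.4028234663852886e+38

# The gap between adjacent doubles (one "ulp") widens as numbers grow,
# which is where large-coordinate precision loss comes from.
for x in (1.0, 1e6, 1e15, 1e300):
    print(x, math.ulp(x))
```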
On January 19, 2038, for example (at 03:14:07 UTC, a time most of Europe will sleep through, since it lands in the small hours of the morning there), 32-bit computers will hit a timekeeping overflow for the first time, as their epoch is set to January 1, 1970 (setting a device's date back to that epoch has been known to brick some phones, by the way). The Unix timestamp surpasses 2,147,483,647, the signed 32-bit limit, and on those machines it rolls over to -2,147,483,648, which maps back to December 13, 1901. For 64-bit computers, the first overflow won't happen for billions of years.
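You can simulate the rollover without waiting until 2038. A minimal sketch in Python, wrapping the timestamp into signed 32-bit range by hand (Python's integers never overflow on their own):

```python
from datetime import datetime, timedelta, timezone

EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

def as_int32(n: int) -> int:
    """Wrap an integer into signed 32-bit range, mimicking two's-complement overflow."""
    return (n + 2**31) % 2**32 - 2**31

last_ok = 2**31 - 1  # 2,147,483,647 -- the last representable second
print(EPOCH + timedelta(seconds=last_ok))                # 2038-01-19 03:14:07+00:00
print(EPOCH + timedelta(seconds=as_int32(last_ok + 1)))  # 1901-12-13 20:45:52+00:00
```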
The absolute limit a double (the 64-bit float most software calculates with) can represent is close to 1.8 × 10^308, past which values roll over to Infinity or NaN. Desmos is a powerful calculator that handles very large numbers gracefully, but it can never go past that threshold. You can zoom out as far as about 10^300; beyond that point the zoom stops, as going further may cause the app to crash or behave erratically.
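That rollover is easy to reproduce, since Python's float is an IEEE 754 double:

```python
import math

x = 1e308
print(x * 10)           # inf: blew past ~1.8e308, the largest finite double
print(x * 10 - x * 10)  # nan: inf - inf has no defined value
print(math.isinf(x * 10), math.isnan(x * 10 - x * 10))  # True True
```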
It is widely known among players of games such as Minecraft that travelling far enough out makes the game behave increasingly strangely, hence the famous terrain artifact known as the Far Lands, which would generate 12,550,820 blocks from spawn, along with notable precision loss when placing objects. This led to a series of mods expanding Minecraft's limits to explore the structure in full; the further out you go, the artifact reportedly breaks down into fringes around a decillion (10^33) blocks from spawn, well past the 64-bit and even the 128-bit and 256-bit limits (I don't know how they do it, but they do).
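That placement precision loss is easy to reproduce. At Far Lands distances a 32-bit float can no longer represent fractions of a block; a sketch below, where the float32 round-trip stands in for a coordinate being stored at single precision (an assumption about the mechanism, not Minecraft's actual code):

```python
import struct

def as_float32(x: float) -> float:
    """Round-trip a double through a 4-byte IEEE 754 single-precision float."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Near spawn, sub-block offsets survive just fine.
print(as_float32(0.25))           # 0.25

# At the Far Lands boundary (~2^23.6), adjacent float32 values are a
# full block apart, so the quarter-block offset silently vanishes.
print(as_float32(12_550_820.25))  # 12550820.0
```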