GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Seeing how strong this low-end/mid PC was, I was thinking of maybe doing a full new build with AM5 and maybe a 9700X, but I could get a 5900XT for 300 bucks from Micro Center. Instead of a build costing thousands, just jacking the CPU up to the highest current AM4 part would help. Thoughts?
I was going to say AM4, but then I saw a 9700X bundle for $400 at Micro Center, including a B650 motherboard and 32 GB of RAM. MC chads win again. Though if you're looking to do another ITX build, the motherboards they toss in are usually ATX, with some MicroATX.
 
I have a Microsoft Intellimouse from 2001 that still works in Windows 11.
I don't believe that for a second. The cord/connection would always get worn out and give up. It's my favorite mouse ever and so many of them have died because of that. Plus it's just a HID device.
When they were developing CUDA, many, many end users, including me, complained to them that a proprietary programming language is intrinsically risky. CUDA is Cg 2.0 in a way, but this time Nvidia got their way.
 
The 50 series is made on the same process node as the 40 series, so every 40 series chip they make is a 50 series chip they don't make. There's no reason to continue making the 40 series at all.
Besides, the 40 and 50 series share the same faulty power delivery design, which is the biggest concern aside from the lack of value and asinine pricing, and 4090s aren't that much cheaper than 5090s will be once pricing and supply stabilize, with every other offering being equally shit in value. I can see the 50 series directly replacing the 40 series offerings, with all of them being as unattractive as ever. I guess the only reason you'd be upset is if you wanted to play the handful of PhysX games that Ada supported but Blackwell doesn't.

You can still get second-hand 40 series cards, or even 30 series, and chances are Ampere will be more than sufficient for most people if they don't need AV1 encoding or fake frames. The 3060 still tops the Steam Survey after all, and refurbed 3090s go for decent prices nowadays: slightly worse performance than a 4070 Ti, higher power draw/heat output which can be curbed a bit with undervolting, double the VRAM, and a proper power delivery design with two or three 6+2-pin connectors. With how the 40 and 50 series are going, and if you're not the type of person who always plays the latest and greatest slop, it's still a great proposition if you feel your current GPU is lacking and you're looking for an upgrade.
I don't believe that for a second. The cord/connection would always get worn out and give up. It's my favorite mouse ever and so many of them have died because of that. Plus it's just a HID device.
With a soldering iron, a replacement cable and a bit of patience you can keep a mouse running forever. If it's USB, that's four wires to solder and a bit of craftsmanship to keep it in place. If its internals fail, however, then it's as good as dead, but the cable is the simplest thing to replace in a mouse, aside from the switches.
 
With a soldering iron, a replacement cable and a bit of patience you can keep a mouse running forever. If it's USB, that's four wires to solder and a bit of craftsmanship to keep it in place. If its internals fail, however, then it's as good as dead, but the cable is the simplest thing to replace in a mouse, aside from the switches.
Yes, but I'd rather pay than go through the hassle of...
It's my spare backup mouse, not my daily driver.
It is a really nice mouse; I liked the size and how light it was. Using one now is like picking up an original PSX controller. It weighs nothing. It was great as a home computing accessory, though I always resented that the scroll wheel replaced the third button on the mouse.
 
Lmao, good job Nvidia.
 
I think we're beginning to hit a wall graphically. A lot of the newer games aren't optimized for shit to take advantage of more powerful hardware (due to DEI hires that can't code worth a damn). Game design as a whole is uninspired, and the cost of these cards keeps going up because NVIDIA are assholes. They don't seem to know what the fuck they're doing a lot of the time now, and their pricing is all over the fucking place. I think when I get my new machine built next year, I'm just going to throw a 4060 in it and use that for the next 10+ years. I'm already on year 5 with my RX 580, and that's about to reach the end of its life cycle (but it's still plenty for Xbox One / PS4 era titles).

It's sad to see people just not give a shit anymore, I dunno.
 
DEI hires that can't code worth a damn
nigger, noone is hiring Shanuiqa to 'code' anything. The problem is Donald Trump's favorite minority group after niggers and faggots, namely Indians, who are hired due to the impression that they produce code of comparable quality to humans at a lower cost (false).
 
nigger, noone is hiring Shanuiqa to 'code' anything. The problem is Donald Trump's favorite minority group after niggers and faggots, namely Indians, who are hired due to the impression that they produce code of comparable quality to humans at a lower cost (false).
It's not even necessarily Indians, but just modern programmers in general. A generation ago they had to work much closer to the machine. Nowadays, most of their work is abstracted away in engines and objects they can just import, so they do. Never mind that they already have an object which does just about the same thing, but not closely enough that the compiler will recognise it and optimise the duplicate away. So that's an extra context switch, and those have always been expensive (X3D is so good because the extra cache means the new context is more likely to be ready in merely semi-slow cache, rather than forcing the thread to wait for the glacial pace RAM moves data at).
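
Not from the post itself, just a rough Python/NumPy sketch of the locality point (array size and names made up): the exact same arithmetic gets several times slower once the accesses stop being cache-friendly, which is the whole reason the extra X3D cache pays off.

import time
import numpy as np

a = np.zeros((8192, 8192), dtype=np.float32)  # C-contiguous: each row sits right after the previous one in memory

def bench(label, fn):
    start = time.perf_counter()
    fn()
    print(f"{label}: {time.perf_counter() - start:.3f}s")

# Row-wise walk reads memory sequentially, so the prefetcher and cache do their job.
bench("row-major walk   ", lambda: sum(float(row.sum()) for row in a))

# Column-wise walk jumps 32 KiB between consecutive elements, so most reads miss cache and wait on RAM.
bench("column-major walk", lambda: sum(float(col.sum()) for col in a.T))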

Thankfully, AI is getting really good at coding. DeepSeek R1 is already on par with a third-year computer engineering student, i.e. the fresh graduates game studios usually hire, and next-generation models will probably be on par with a programmer with a master's in applied mathematics (the type of programmer that actually still learns optimisation). Studios will make a design document (AI can already do this very well), assign programmers to use AI to write the blocks, and have other AI stitch them together and optimise the product. And thanks to even more AI, the game can actually be tested before release, rather than be unplayable for weeks while the worst bugs are patched out.
 
It's not Indians or programmers, it's rose-tinted glasses. People are remembering old games as running far better than they did because they're running them on hardware that's 5-10 years newer. But mentally, the 2005 game and the 2015 GPU are both "old," so your brain just sees "old game runs great on old hardware."

Studios will make a design document (AI can already do this very well), assign programmers to use AI to write the blocks, and have other AI stitch them together and optimise the product. And thanks to even more AI, the game can actually be tested before release, rather than be unplayable for weeks while the worst bugs are patched out.

I doubt AI is going to fix anything, because the main performance bottleneck isn't the lack of clever code optimizations, it's sheer bloat. Like one of the most infamous examples is how Call of Duty uses uncompressed sound now. You can batch compress sounds with the press of a button, and it's all done with native libs, zero coding needed. They don't compress sounds because they don't feel like it.
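
For what it's worth, "press of a button" really is about the size of it. This isn't anyone's actual pipeline, just a hypothetical sketch assuming ffmpeg is installed and on PATH: walk a folder of raw WAVs and transcode everything to Ogg Vorbis.

import subprocess
from pathlib import Path

SRC = Path("audio_raw")     # hypothetical folder of uncompressed WAVs
DST = Path("audio_packed")  # compressed output, mirroring the same layout

for wav in SRC.rglob("*.wav"):
    out = DST / wav.relative_to(SRC).with_suffix(".ogg")
    out.parent.mkdir(parents=True, exist_ok=True)
    # -q:a 5 is roughly 160 kbps VBR Vorbis, plenty for dialogue and effects.
    subprocess.run(
        ["ffmpeg", "-y", "-i", str(wav), "-c:a", "libvorbis", "-q:a", "5", str(out)],
        check=True,
    )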

There are also all kinds of automated tools for improving code performance, and devs simply refuse to use them. Source: I was the guy who used all these tools and trained others to use them, and they just didn't use them. I figure our product would have been 2x-3x faster if people hadn't been so fucking lazy.
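
The post doesn't say which tools, so take this as a generic stand-in rather than whatever that team actually used: even Python ships a profiler in the standard library that points at the hotspot for you, no third-party setup needed.

import cProfile
import pstats

def slow_concat(n):
    # Deliberately wasteful: builds a huge string one piece at a time.
    s = ""
    for i in range(n):
        s += str(i)
    return s

profiler = cProfile.Profile()
profiler.enable()
slow_concat(200_000)
profiler.disable()

# Show the ten most expensive calls so the offender is obvious.
pstats.Stats(profiler).sort_stats("cumulative").print_stats(10)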
 
Whoever at Nvidia decided to discontinue the 40 series deserves to be beaten with a tire iron.
That's a normal thing; it makes no sense to continue buying fab space on the same node for your old cards, it would literally just be selling the same silicon for less.
It makes perfect sense why they do it.
 
It's everything. Just look at GTA: San Andreas on the PS2. The hardware was limited, the ambitions were big, and R* pulled it off by having talented programmers who knew when to write complex code to push the hardware to its limits, and when to use simple tricks that worked well enough and saved on resources. Not only that, due to those hardware limitations they couldn't have a photorealistic game, so they did their best to achieve a distinctive look. Graphically, it does look badly dated, with the low-poly models and blurry textures, but stylistically, it still holds up. The orange sunrise over 90's LA suburbs will still look as good as ever.

Nowadays we have an overabundance of hardware capabilities, but instead of bringing more advanced games and graphics, it has brought stagnation, or even regression, as this overabundance gets abused as a buffer for dev laziness. Who cares about making things well if you can just half-ass everything and hope that whoever plays it has a 4070 Ti so it'll even out? Plus, since the "photorealistic graphics" rabbit chase has only picked up speed, it only leads to games aging like milk. Every game will age graphically, but when you ditch all style for the sake of "realism", your game has nothing to withstand the test of time. The moment graphics technology gets older, so does the visual style of the game, while a ton of PS2 titles still look good despite being technologically outdated.

Just look at the games from the Pascal era. Metal Gear Solid V, The Witcher 3: Wild Hunt, Grand Theft Auto V on PC. I remember playing TW3 and GTA V on my 1060 and it already looked and ran great, on a mid-range card from the same time those games were released. Games knew how to use the hardware that was available and it was all already this close to catching that photorealism rabbit. But something went wrong, games started to look only marginally better while running significantly worse, and now we've hit the wall where in many cases the games are looking worse than those 2015 titles while running worse than those 2015 titles. But instead of tackling the source of this issue, everyone is gaslit into believing that their hardware is at fault, that they need to buy the newest RTX card with 4x frame generation to make the games run well, and everyone is buying into this lie for some reason.

Not to mention the whole ray tracing scam where the games, again, only look marginally better than what they already looked like with the conventional 3D graphics tricks while significantly raising computing demand on the hardware. For what exactly? For more reasons to convince you to buy yet another fancy, expensive GPU, when in reality all that is needed is for game developers to do their fucking job right.

Remember: everything "AI" that Nvidia is throwing out is there to excuse the dev laziness even further, not to fix the sorry state of games today. If everyone, from individual game studios to game engine developers, gave two shits about actually harnessing the computing power of today, there would be no need to abuse DLSS upscaling and frame generation to make the games playable. They would only be there as a little extra, not a mandatory component, like filling in the gaps to achieve high frame rate 4K gaming, with everything below it being achievable natively.
 
I doubt AI is going to fix anything, because the main performance bottleneck isn't the lack of clever code optimizations, it's sheer bloat. Like one of the most infamous examples is how Call of Duty uses uncompressed sound now. You can batch compress sounds with the press of a button, and it's all done with native libs, zero coding needed. They don't compress sounds because they don't feel like it.
I can see AI lowering game sizes, if (((they))) feel like it. Nvidia just showed off Neural Texture Compression, which is similar to AMD's Neural Texture Block Compression. I know voice acting alone has hit at least 50 GB for some open world games, and that could be replaced with licensed voice patterns for text-to-speech. Collision sounds could be calculated in real time instead of stored, saw a demo of that years ago.

If developers don't care, gamers don't complain, or the technologies are too proprietary or hardware intensive, then everyone can continue to eat shit as usual. There may be copyright and personality right disputes over the AI voice cloning, especially after someone mods GTA/TES VII to make everyone say nigger, which is much worse than hot coffee.
Not to mention the whole ray tracing scam where the games, again, only look marginally better than what they already looked like with the conventional 3D graphics tricks while significantly raising computing demand on the hardware. For what exactly? For more reasons to convince you to buy yet another fancy, expensive GPU, when in reality all that is needed is for game developers to do their fucking job right.
I believe there are benefits, but they'll only be fully realized after at least another PlayStation console generation debuts, and probably longer for ray tracing to become mandatory and normalized in the industry, so like 2030-2035. And of course they want to sell you more hardware. If developers don't make their games unoptimized enough and everything starts to run at 4K240 on entry-level hardware, Nvidia will collude with them to push the next big thing.
 
It's everything. Just look at GTA: San Andreas on the PS2. The hardware was limited, the ambitions were big, and R* pulled it off by having talented programmers who knew when to write complex code to push the hardware to its limits, and when to use simple tricks that worked well enough and saved on resources.

It still ran at a dodgy 20-25 fps in 480i on its target platform. The games people are crying about being "poorly optimized" merely run at 60 fps at 1080p. Would people really praise the optimization if a brand-new game bit off more than the hardware could chew and ran at 25 fps & 720p? Also, funnily enough, lots of PS2 games were rendered internally at 640x240 and relied on the TV's interlacing to give the impression of 640x480. Many games also used the console's hardware motion blur to look smoother than they were. It's not substantially different than upscaling and frame generation.

Nowadays we have an overabundance of hardware capabilities, but instead of bringing more advanced games and graphics

We're still getting more advanced graphics. The games suck for other reasons, mostly related to corporatism, the monetization treadmill, and wokeness.

Not to mention the whole ray tracing scam where the games, again, only look marginally better than what they already looked like with the conventional 3D graphics tricks while significantly raising computing demand on the hardware.

Because the conventional tricks ran into their limits, and doing things more accurately for more realism comes at enormous cost. It's not a scam, it's just the reality of the technology S-curve, which is itself just an expression of the 80/20 rule.

Ray tracing is a great example. Let's say you notice the artifacts and problems with tricks like cube mapping and SSR 10% of the time. Well, you can't fix them by just doing 10% extra work using the same old techniques. You have to toss them in the garbage and do physically accurate light calculations. The last 10% costs 100x more than the previous 90% put together.

That cost is not because devs are lazy or because NVIDIA has scammed you, it's because we've pushed cheap tricks as far as they can go, and the next frontiers in rendering realism will amount to marginal improvements at enormous cost. The same thing has happened in everything from automobile fueling systems to air conditioners.
 
Because the conventional tricks ran into their limits, and doing things more accurately for more realism comes at enormous cost. It's not a scam, it's just the reality of the technology S-curve, which is itself just an expression of the 80/20 rule.
Except we don't need realism, and the talent involved with faking it is inherently more artistic than spamming ray tracing everywhere. The same thing happened with film; CGI allowed you to go anywhere and do anything and in doing so lost a core part of what made film imaginative.
 
I'm still trying to figure out how the fuck a lot of these tech YouTubers can get hold of these high-end cards when the average consumer can't find them anywhere.
 