GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

The 4070 was even supposed to be $750 originally, despite really being a 4060
Remember the 12GB 4080 fiasco? That's the sort of shameless, transparent greed that would tarnish a normal company so badly that it would enter a "beginning of the end" phase. It's funny that I went from not knowing much about hardware pre-2020 to BURN NVIDIA TO THE GROUND since the 4000 series dropped.

I can't imagine how much longer they can keep bleeding their simps dry before they stop buying their shit, if a normie like me can see through their horseshit.
 
Remember the 12GB 4080 fiasco? That's the sort of shameless, transparent greed that would tarnish a normal company so badly that it would enter a "beginning of the end" phase. It's funny that I went from not knowing much about hardware pre-2020 to BURN NVIDIA TO THE GROUND since the 4000 series dropped.

I can't imagine how much longer they can keep bleeding their simps dry before they stop buying their shit, if a normie like me can see through their horseshit.
There are people who worship Nvidia literally because they're "sharks". Like, why would a consumer ever respect a company's ability to screw the market to the wall? It makes no sense unless you're a shareholder.

Same shit in the bulldozer days when Intel fanbois were cheering on the destruction of AMD. Dumb niggers apparently would be happy if your only cpu choice was Intel.
 
I want to like AMD, as they are by far the easiest of the big three to deal with, but their software is so, so, so incredibly bad. Everything ROCm/HIP/etc related that I've had the misfortune to interact with is a steaming pile of shit. NVIDIA's is a good deal better, but I strongly dislike them as a company. Intel's sort of in the middle. Not as nice to deal with as AMD, but not horrible like NVIDIA, and their hardware is just behind, but their software is the best, and it's not close. I want Intel datacenter GPUs to win solely because SYCL is a billion times better than CUDA or HIP.
 
I want to like AMD, as they are by far the easiest of the big three to deal with, but their software is so, so, so incredibly bad. Everything ROCm/HIP/etc related that I've had the misfortune to interact with is a steaming pile of shit. NVIDIA's is a good deal better, but I strongly dislike them as a company. Intel's sort of in the middle. Not as nice to deal with as AMD, but not horrible like NVIDIA, and their hardware is just behind, but their software is the best, and it's not close. I want Intel datacenter GPUs to win solely because SYCL is a billion times better than CUDA or HIP.
I held off on buying my 6900 XT for so long solely because I was waiting for Intel GPUs. I was going to buy one whether it was any good or not, just on the basis of Intel iGPUs being the only kind of graphics I'd never had any problems with on Linux. But it was delay after delay, so in the end I had to buy AMD anyway. And wouldn't you know, driver issues. Not anything on the level of Nvidia's proprietary garbage driver, but a licensing issue from the HDMI consortium, which apparently insisted that Linux users only be allowed 4K 120 Hz if we're restricted to 4:2:2 chroma.
 
I can't imagine how much longer they can keep bleeding their simps dry before they stop buying their shit, if a normie like me can see through their horseshit.
People have been saying they're gonna burn Nvidia to the ground since the 2080, and it never happens. Until AMD or Intel gets their shit together (not likely anytime soon) and completely BTFOs Nvidia like how ATi was dabbing on them in the early-to-mid 2000s, expect the 5090 to be $2000, the 6090 to be $3000, and so on and so forth.
 
People have been saying they're gonna burn Nvidia to the ground since the 2080, and it never happens. Until AMD or Intel gets their shit together (not likely anytime soon) and completely BTFOs Nvidia like how ATi was dabbing on them in the early-to-mid 2000s, expect the 5090 to be $2000, the 6090 to be $3000, and so on and so forth.
Like I said, I'm still new to this whole thing, but I'm gonna read about that just to get some satisfaction from a time when Nvidia was being used as a punching bag by another corporation.
 
4070: 5888 CUDA cores @ 1.9-2.5 GHz, 12 GB GDDR6X @ 504.2 GB/s
4060: 3840 CUDA cores @ 2.3-2.5 GHz, 8 GB GDDR6 @ 288 GB/s

Overpriced or not, those aren't the same card.
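For anyone wondering where those bandwidth figures come from, it's just memory bus width times per-pin data rate. Quick back-of-envelope check below; the bus widths and data rates are the commonly reported ones for these parts (192-bit @ 21 Gbps and 128-bit @ 18 Gbps), not something quoted in the specs above, so treat it as a sketch rather than gospel.

Python:
# Peak memory bandwidth in GB/s: bus width in bytes (bits / 8) times per-pin data rate in Gbps.
def mem_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    return bus_width_bits / 8 * data_rate_gbps

print(mem_bandwidth_gbs(192, 21))  # 4070: 192-bit @ 21 Gbps -> 504.0 GB/s, matching the ~504 figure
print(mem_bandwidth_gbs(128, 18))  # 4060: 128-bit @ 18 Gbps -> 288.0 GB/s, matching the 288 figure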
That doesn't really mean much. The 4080 12GB was a renamed 4070 Ti, even though it was vastly different from the actual 4080.

Who's to say the 4070 isn't what was originally going to be a 4060 Ti, until Nvidia realized they could get away with yet another significant product shift upwards?
 
That doesn't really mean much. The 4080 12GB was a renamed 4070 Ti, even though it was vastly different from the actual 4080.

Who's to say the 4070 isn't what was originally going to be a 4060 Ti, until Nvidia realized they could get away with yet another significant product shift upwards?

He didn't say it was originally planned to be branded 4060 Ti. He said it was "really a 4060," and it's just not the same card. Besides, it's not like a GPU brand means anything... the 4060 Ti is on the way, and it's rumored to land between the 4060 and the 4070, which is how that branding usually goes.

Unrelated: I've benchmarked the same code on an M1 MacBook Pro (10-core), an i9-12900, and a Ryzen 7 6800U. It's a simple Python script I wrote that farms out some decently heavy numpy calculations to as many processes as you'll give it. Timings:

M1 10c: 64s
i9-12900: 86s
6800U: 110s

On top of being significantly slower, the x86 chips are of course spewing heat everywhere and maxing out their fans during the run. This is really absurd, and both Intel and AMD should feel bad about themselves.
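For anyone curious, a minimal sketch of what that kind of benchmark looks like is below. This isn't the exact script; heavy_task is a made-up stand-in for the real work, and the matrix sizes and task count are arbitrary, but the shape is the same: hand N chunks of numpy-heavy work to a pool of worker processes and time the whole run.

Python:
import time
import numpy as np
from multiprocessing import Pool

def heavy_task(seed):
    """Stand-in for the real work: some deliberately expensive linear algebra."""
    rng = np.random.default_rng(seed)
    a = rng.random((2000, 2000))
    b = rng.random((2000, 2000))
    c = a @ b                                        # dense matmul keeps the cores busy
    return float(np.linalg.eigvalsh(c + c.T).max())  # symmetric eigensolve on top

if __name__ == "__main__":
    n_workers = 10   # e.g. all 10 cores of the M1 Pro
    n_tasks = 40     # total chunks of work handed out
    start = time.perf_counter()
    with Pool(processes=n_workers) as pool:
        pool.map(heavy_task, range(n_tasks))
    print(f"{n_tasks} tasks on {n_workers} workers: {time.perf_counter() - start:.1f}s")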
 
On top of being significantly slower, the x86 chips are of course spewing heat everywhere and maxing out their fans during the run. This is really absurd, and both Intel and AMD should feel bad about themselves.
Cool? Tell me when Apple can play my games. I really give no shits about code execution, and neither do most people. Buy the tool best suited for your needs. There are things x86 chips can do that Apple can't even imagine.

I get it. You love Apple for what it does. Within its walled garden it's good.

Also yes, GPU brand does mean something. Those were tiers of cards that have vast precedent backing them up. This is why Nvidia gets away with tacking $600 price tags onto mid-range hardware. People just go "lol, just a name". Just think: a 3070 was pretty much a 2080 Ti minus some VRAM. The 4070 isn't even quite a 3080. These names do imply something.
 
Cool? Tell me when Apple can play my games.

I get it. You love Apple for what it does. Within its walled garden it's good.

Python runs on any platform. And it's not even my laptop (it doesn't run the world's greatest operating system, Windows 11), so no need to get so assblasted about it.
 
The 4070 isn't even quite a 3080. These names do imply something.
Oh yeah, I vaguely remember the 3080 beating the 4070 in the video review I glanced at. And the 4070 loses in Tom's Hardware's review too. Just turn on DLSS 3!

I wonder if AMD is going to make any adjustments to the VRAM for the 7700 XT and below now that everyone is screaming about 8-12 GB not being enough.
 
The 3080 has less memory and doesn't support DLSS 3.


My CPU benchmarks aren't for you.
The 3080 12GB exists, and oh boy, DLSS 3. Nvidia selling weaker hardware for more money because of good (for now, and sometimes limited) software.

As for the benchmarks, you quoted me with them, so I thought they had something to do with the previous CPU talk. If not, then disregard.
 
The 3080 12GB exists, and oh boy, DLSS 3. Nvidia selling weaker hardware for more money because of good (for now, and sometimes limited) software.

I'm not looking up the article, but the vast majority of NVIDIA card users use DLSS if it's available (the number I have in my head is 70%), which is why they're going all-in on DLSS improvements being the key differentiator for the next generation of graphics cards, like what T&L ended up being 20 years ago.

I don't know if they're right. Personally, I've been pretty unimpressed with raytraced graphics with DLSS. I'd rather just have rasterization and a card that doesn't put the fire department on notice.

As for the benchmarks, you quoted me with them, so I thought they had something to do with the previous CPU talk. If not, then disregard.

They were in fact unrelated, sorry if that wasn't clear - my comments on the 4000 series GPUs were directed at you. The CPU stuff was "unrelated," more directed at @snov, really, since she and I do a lot of the same stuff with computers, and I was earlier bitching about how much power my AMD-based laptop sucks down.
 
but the vast majority of NVIDIA card users use DLSS if it's available (the number I have in my head is 70%)
Are those the figures provided by the telemetry data in the Nvidia drivers? If so, I'm always skeptical of that, because I think it also claimed like 40+% of owners have "turned on RT", but it probably doesn't distinguish between people who use it often and those who turned it on one time just to check something.

Or if a game has it on by default, like how Hogwarts Legacy had FSR turned on automatically for my 6800 XT, which I then promptly turned off. But technically that makes me an "FSR user" lol.

Idk. I'm bummed that it's becoming a race of who can shore up weak hardware the best and fastest with software, because the consumer base will now willingly pay more for inferior hardware because of marketing. Like, there are people who actually consider more VRAM a negative. "Why does it need so much?" Why would anybody ever ask that? Just be happy to have it.
 
Are those the figures provided by the telemetry data in the Nvidia drivers? If so, I'm always skeptical of that, because I think it also claimed like 40+% of owners have "turned on RT", but it probably doesn't distinguish between people who use it often and those who turned it on one time just to check something.

Or if a game has it on by default, like how Hogwarts Legacy had FSR turned on automatically for my 6800 XT, which I then promptly turned off. But technically that makes me an "FSR user" lol.

Idk. I'm bummed that it's becoming a race of who can shore up weak hardware the best and fastest with software, because the consumer base will now willingly pay more for inferior hardware because of marketing. Like, there are people who actually consider more VRAM a negative. "Why does it need so much?" Why would anybody ever ask that? Just be happy to have it.
Data from millions of RTX gamers who played RTX capable games in February 2023 shows 79% of 40 Series gamers, 71% of 30 Series gamers and 68% of 20 Series gamers turn DLSS on. 83% of 40 Series gamers, 56% of 30 Series gamers and 43% of 20 Series gamers turn ray tracing on.
Data from RTX 20 Series gamers who played RTX-capable games in 2018.
Ars thinks it's GeForce Experience users.
 
Ars thinks it's GeForce Experience users.
Yeah, that's the automatic telemetry stuff. I imagine it logs you as a +1 statistic as soon as you either enable the setting or it gets enabled by default. It would be extremely easy for Nvidia to pad out the numbers. There's no way in hell nearly half of Turing users actually play games with RT on, unless it's absolute shit mode. Most of those cards were pretty bad at it.
 
Are those the figures provided by the telemetry data in the Nvidia drivers? If so, I'm always skeptical of that, because I think it also claimed like 40+% of owners have "turned on RT", but it probably doesn't distinguish between people who use it often and those who turned it on one time just to check something.

Or if a game has it on by default, like how Hogwarts Legacy had FSR turned on automatically for my 6800 XT, which I then promptly turned off. But technically that makes me an "FSR user" lol.

Idk. I'm bummed that it's becoming a race of who can shore up weak hardware the best and fastest with software, because the consumer base will now willingly pay more for inferior hardware because of marketing. Like, there are people who actually consider more VRAM a negative. "Why does it need so much?" Why would anybody ever ask that? Just be happy to have it.

I don't think that's it. The 4070 has about 25% more transistors than the 3080, and they're being used for something, not nothing. I think it is NVIDIA trying to find a way to reinvigorate GPU sales, which are at their lowest point in ages, and the only thing they know to do is more, bigger, hotter, faster. They need that next big thing, and they're hoping it's raytracing. The problem is that raytracing is extremely expensive, especially to get a result noticeably better than the latest rasterization techniques, as in more than just, "oh, nice reflections, I guess." DLSS is about bringing down that cost so it can actually work at playable frame rates. Personally, I think GPUs have overshot the market. When the #1 game on Steam is 11 years old, maybe people just don't care that much any more.
 