GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

What's the deal with curved monitors? Are they good, or shit? I haven't ever sat in front of one so I have no idea.
I tried one out a few years ago and it was very nice. The cost is still far too prohibitive for pure entertainment, though. I did consider getting one for work, since it felt easier on the eyes to have the edges closer to me than on a usual ultrawide.
 
You know what? Let's talk water cooling. So I do custom looping for fun, not show, and I like to save a buck when I can. When trying to look up if anyone has used (insert cheap solution here), nowadays all you find are people buying the same exact overpriced shit doing the same exact loops in the same exact cases, etc, etc, and having a big circle jerk over it.

I'm absolutely disgusted by how ritzy what used to be a tinkerer's hobby has become.
 
All I want is a fucking Asus White 3090 at a reasonable price. I know it's dated now (I guess, still gonna be strong for a while), but fucking scalpers and shit just won't lower prices for anything anymore. I'm not paying damn near two grand for a used item that's technically out of date.
 
All I want is a fucking Asus White 3090 at a reasonable price. I know it's dated now (I guess, still gonna be strong for a while), but fucking scalpers and shit just won't lower prices for anything anymore. I'm not paying damn near two grand for a used item that's technically out of date.
Just take the shroud off and paint it. Bam. Done.
 
yeah im going full amd this time: 5600 and the 6700xt. also for the first time im going to ditch hdd and go full ssd
im going to mount everything on a MSI PRO B550M-VC WiFi ProSeries
I have a 5600x with 4 B-Die Vipers and a 3060 Ti now. It's more than good enough for everything I need. Hopefully someday AMD and Nvidia will have to price things reasonably for the majority.
 
Current Steam survey: [chart attachment]


It's going to be interesting to see what Nvidia does. Part of the huge price increase on the 4000 series is the terrible deal they're getting from TSMC, due to 1) the deal being inked during the height of demand and 2) them coming back groveling after switching to Samsung.

Obviously for the halo products they've just let prices soar and held inventory back, but neither is an option for the mainstream cards. They either raise the price and lose volume, or wear the extra costs and lose profit.
 
It makes me wonder wtf Nvidia is doing. 50 and 60 series cards don't attract customers just by the model number; it's the price point people like, and that price point is fucked right now. The cheapest 3050 I can find (where I live) is ~€430. What's even more absurd is that the cheapest 3060 is just 20 euros more expensive at the same shop. There's no price progression like in the past: if someone wants Nvidia it starts at €400+, or if that's too much, just enjoy the integrated graphics or get an absurdly priced 1030.

The RX 6400 is fucking stupid but AMD at least tries to do something.
 
It makes me wonder wtf Nvidia is doing. 50 and 60 series cards don't attract customers just by the model number; it's the price point people like, and that price point is fucked right now. The cheapest 3050 I can find (where I live) is ~€430. What's even more absurd is that the cheapest 3060 is just 20 euros more expensive at the same shop. There's no price progression like in the past: if someone wants Nvidia it starts at €400+, or if that's too much, just enjoy the integrated graphics or get an absurdly priced 1030.

The RX 6400 is fucking stupid but AMD at least tries to do something.

That's way above list price, so what you're seeing is likely more a matter of availability than anything else. Plus, NVIDIA isn't making those cards any more, they're all made by licensees now.

This is purely speculation on my part, but I strongly suspect that down-market demand is heavily affected by whoever's waving their dick in the press about having the current most powerful video card in existence. This creates a brand association in people's minds that "NVIDIA = power, AMD = budget," so they will pay insane price premiums for NVIDIA cards. You'll often see an NVIDIA card here go for $100-$200 more than an AMD card that puts up nearly identical benchmark scores.

And to be fair to NVIDIA, AMD also had serious quality issues a couple years ago, and poor quality is a tough reputation problem to overcome.
 
That's way above list price, so what you're seeing is likely more a matter of availability than anything else. Plus, NVIDIA isn't making those cards any more, they're all made by licensees now.

This is purely speculation on my part, but I strongly suspect that down-market demand is heavily affected by whoever's waving their dick in the press about having the current most powerful video card in existence. This creates a brand association in people's minds that "NVIDIA = power, AMD = budget," so they will pay insane price premiums for NVIDIA cards. You'll often see an NVIDIA card here go for $100-$200 more than an AMD card that puts up nearly identical benchmark scores.

And to be fair to NVIDIA, AMD also had serious quality issues a couple years ago, and poor quality is a tough reputation problem to overcome.
There's plenty of them available, with more coming in (according to what they list), and there are no new 4050s or 4060s. (I also guesstimated the conversion rate to euros; divide by ten used to work, but it turns out that our currency is in the shitter, so instead of €430 it's more like €390, and that includes 25% VAT.)
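
For what it's worth, that guesstimate is easy to sanity-check. Here's a minimal sketch of the arithmetic: the local sticker price and exchange rate below are hypothetical stand-ins (only the 25% VAT figure comes from the post itself):

```python
# Back-of-the-envelope price conversion; local_price and rate are hypothetical stand-ins.
local_price = 4330        # hypothetical sticker price in local currency, VAT included
rate = 11.1               # hypothetical local-currency-per-euro rate ("divide by ten" no longer holds)
vat = 0.25                # 25% VAT, as stated above

price_eur = local_price / rate
price_eur_ex_vat = price_eur / (1 + vat)
print(f"~€{price_eur:.0f} incl. VAT, ~€{price_eur_ex_vat:.0f} excl. VAT")
```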

Having the "performance king" at the top of the charts absolutely have an effect on brand perception. The GeForce FX 5950 Ultra was infamous for being a card that was only produced to get Nvidia to the top on benchmark charts and it was produced in such limited quantities that it is now a collectible. One sold for $200 on eBay just recently which is pretty nuts for a 20 year old graphics card that runs hot as hell. I don't remember the model numbers but Intel pushed out some crazy CPUs to claw themselves back up on the charts and I absolutely believe that is something that influences the sales of i5's.

Hardware Unboxed uses a "frames per dollar" metric, which is pretty informative, and there should be more focus on that instead of putting it at the end of the video.
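
The metric itself is trivial to compute from any review's numbers. A minimal sketch of that kind of comparison is below; the card names, prices, and FPS figures are made-up placeholders, not real benchmark data:

```python
# Rough frames-per-dollar comparison, in the spirit of cost-per-frame charts.
# All prices and FPS numbers are made-up placeholders, not real benchmark results.
cards = {
    "Card A": {"price_usd": 400, "avg_fps": 90},
    "Card B": {"price_usd": 550, "avg_fps": 110},
    "Card C": {"price_usd": 300, "avg_fps": 65},
}

# Rank cards by value: average FPS divided by price, best value first.
ranked = sorted(cards.items(), key=lambda kv: kv[1]["avg_fps"] / kv[1]["price_usd"], reverse=True)
for name, c in ranked:
    fpd = c["avg_fps"] / c["price_usd"]
    print(f"{name}: {fpd:.3f} frames per dollar (${c['price_usd']}, {c['avg_fps']} fps)")
```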
 
And to be fair to NVIDIA, AMD also had serious quality issues a couple years ago, and poor quality is a tough reputation problem to overcome.
NVIDIA has some pretty good blunders themselves, even on the flagship cards, but that all gets swept under the rug because Team Green is a cult.

*Edit* Not to mention easily beating AMD in anti-consumer practices and blatantly disrespecting the segment that made them who they are. BUT MUH FPS!
 
yeah im going full amd this time: 5600 and the 6700xt. also for the first time im going to ditch hdd and go full ssd
im going to mount everything on a MSI PRO B550M-VC WiFi ProSeries
i am on a 6700xt with a 5700X processor and an NVMe ssd and it's going pretty sweet.

Also, thanks to all for the advice.
 
There's plenty of them available, with more coming in (according to what they list), and there are no new 4050s or 4060s. (I also guesstimated the conversion rate to euros; divide by ten used to work, but it turns out that our currency is in the shitter, so instead of €430 it's more like €390, and that includes 25% VAT.)

Having the "performance king" at the top of the charts absolutely have an effect on brand perception. The GeForce FX 5950 Ultra was infamous for being a card that was only produced to get Nvidia to the top on benchmark charts and it was produced in such limited quantities that it is now a collectible. One sold for $200 on eBay just recently which is pretty nuts for a 20 year old graphics card that runs hot as hell. I don't remember the model numbers but Intel pushed out some crazy CPUs to claw themselves back up on the charts and I absolutely believe that is something that influences the sales of i5's.

If you remember the GeForce FX 5950, you may remember the GeForce 4 MX. It was based on the same Celsius architecture as GeForce 2, not the Kelvin architecture that the 3 Ti and 4 Ti were based on. But hey, that Doom 3 demo was blowing up in the press, so why not slap the "GeForce 4" brand on a card that would never be able to run it and laugh all the way to the bank?

Hardware Unboxed uses a "frames per dollar" metric, which is pretty informative, and there should be more focus on that instead of putting it at the end of the video.

Passmark has some nice charts for this.


5700 XT and 6700 XT are very good buys right now.
 
If you remember the GeForce FX 5950, you may remember the GeForce 4 MX. It was based on the same Celsius architecture as GeForce 2, not the Kelvin architecture that the 3 Ti and 4 Ti were based on. But hey, that Doom 3 demo was blowing up in the press, so why not slap the "GeForce 4" brand on a card that would never be able to run it and laugh all the way to the bank?
The GeForce 4 MX was such a fucking scummy move; the rest of the GF4 line was so nice. The GeForce 2 MX was honestly very good though: a cheap card with GeForce 256 DDR-level performance and dual display outputs, which was something the mainline cards lacked at the time. It could be modified and turned into a Quadro as well, though that only meant it accepted Quadro drivers instead of the recently hobbled GeForce drivers.
 
For those of you too young to remember the 00s, each generation of GPUs in that era was a step change in graphics, introducing features that would not and could not execute on earlier cards. The GeForce 4 MX was not just too slow to run Doom 3 at a playable frame rate. It lacked the instruction set to do per-pixel dot products needed to make normal mapping work at all, or the per-vertex calculations needed for the stencil shadows.
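
To make the "per-pixel dot product" part concrete: Doom 3-style normal-mapped lighting boils down to evaluating something like N·L for every pixel. Below is a rough sketch of that math in plain Python, standing in for what the GPU did per fragment; the tiny normal map and light direction are made up for illustration:

```python
# Minimal sketch of the per-pixel N·L term behind normal-mapped (Lambertian) lighting.
# Plain Python stands in for per-fragment GPU work; all values are made up.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def normalize(v):
    length = sum(x * x for x in v) ** 0.5
    return tuple(x / length for x in v)

# A tiny 2x2 "normal map": per-pixel surface normals in tangent space.
normal_map = [
    [(0.0, 0.0, 1.0), (0.3, 0.0, 0.95)],
    [(0.0, 0.3, 0.95), (0.2, 0.2, 0.96)],
]

light_dir = normalize((0.5, 0.5, 1.0))  # direction toward the light, tangent space

for row in normal_map:
    for n in row:
        n = normalize(n)
        diffuse = max(0.0, dot(n, light_dir))  # the per-pixel dot product discussed above
        print(f"{diffuse:.2f}", end=" ")
    print()
```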
 
For those of you too young to remember the 00s, each generation of GPUs in that era was a step change in graphics, introducing features that would not and could not execute on earlier cards. The GeForce 4 MX was not just too slow to run Doom 3 at a playable frame rate. It lacked the instruction set to do per-pixel dot products needed to make normal mapping work at all, or the per-vertex calculations needed for the stencil shadows.
The previous GeForce architectures could actually do some of it, but it was a vendor-specific fixed-function operation. Bump mapping, that is, not normal mapping. I actually remember the night a dude figured out normal mapping and released it as a proof of concept, using a couple of extra (cheap) instructions to parallax texels in what was otherwise a bump map. It was flawed as hell but incredibly intriguing.
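
What's described there sounds essentially like what later became known as parallax mapping: offset the texture lookup along the view direction by an amount derived from the sampled height/bump value. A rough sketch of that idea follows; it is not the original proof of concept, and the height map, scale, and bias values are made up:

```python
# Rough sketch of parallax mapping: nudge the texture coordinate along the view
# direction's xy by an amount derived from the height (bump) value at that texel.
# Height map, scale, and bias are made-up placeholders.

height_map = {
    (0, 0): 0.1, (1, 0): 0.8,
    (0, 1): 0.4, (1, 1): 0.9,
}

scale, bias = 0.04, -0.02   # typical-looking parallax constants, chosen arbitrarily

def parallax_uv(u, v, view_dir_tangent):
    """Offset (u, v) along the view direction based on the sampled height."""
    h = height_map[(int(u), int(v))]          # stand-in for a texture fetch
    offset = h * scale + bias                 # the "couple of cheap instructions"
    vx, vy, vz = view_dir_tangent
    return u + vx * offset, v + vy * offset   # shifted lookup coordinate

print(parallax_uv(1, 1, (0.6, 0.2, 0.77)))
```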
 