GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

HU about green's newest technology:
I love seeing these videos where they go on about making sure you have a 4K setup and fuckoff monitors, and how if you're benefiting from >100fps at 4K, DLSS will kick your gear into the 200fps region, and I'm sitting here on 1080p fucking with antialiasing and texture quality to try and make sure I can get a steady 40-60fps if I'm lucky.

Genuinely curious who affords these rigs or am I just so old and out of touch with today's youth that spending four grand on your gaming machine is totally normal now? Because even before I had a house, family, car etc. this still would have been a hilarious amount of money to drop.
 
I love seeing these videos where they go on about making sure you have a 4K setup and fuckoff monitors, and how if you're benefiting from >100fps at 4K, DLSS will kick your gear into the 200fps region, and I'm sitting here on 1080p fucking with antialiasing and texture quality to try and make sure I can get a steady 40-60fps if I'm lucky.

Genuinely curious who affords these rigs or am I just so old and out of touch with today's youth that spending four grand on your gaming machine is totally normal now? Because even before I had a house, family, car etc. this still would have been a hilarious amount of money to drop.
No, it's not normal. It's just influencers and marketing making it seem normalized.
 
Genuinely curious who affords these rigs or am I just so old and out of touch with today's youth that spending four grand on your gaming machine is totally normal now? Because even before I had a house, family, car etc. this still would have been a hilarious amount of money to drop.
I honestly think it's just a feedback loop of influencers having the best stuff and advertising it to the 10% with money who are the loudest about constantly talking up their monster setups. 9-12 months pass and the cycle continues.

I still get annoyed at myself for buying a 3070 FE at MSRP when I really wanted a 3060 Ti, but this was peak COVID so I grabbed what I could. Any setup over $1,200 is overkill if you ask me.
 
I love seeing these videos where they go on about making sure you have a 4K setup and fuckoff monitors, and how if you're benefiting from >100fps at 4K, DLSS will kick your gear into the 200fps region, and I'm sitting here on 1080p fucking with antialiasing and texture quality to try and make sure I can get a steady 40-60fps if I'm lucky.

Genuinely curious who affords these rigs or am I just so old and out of touch with today's youth that spending four grand on your gaming machine is totally normal now? Because even before I had a house, family, car etc. this still would have been a hilarious amount of money to drop.
I don't think it's actually that expensive to get into 4K anymore, especially after recent price drops. An RTX 3070 is probably good enough, and 4K TVs are appearing under $200. Whenever you see "ultra" settings being used in benchmarking, those are generally overkill and can be dropped for better FPS.

But let's look at Steam Survey:

1920x1080 = 66.38%
2560x1440 = 11.25%
1366x768 why won't you diieeee = 5.55%
3840x2160 = 2.46%
 
I love seeing these videos where they go on about making sure you have a 4K setup and fuckoff monitors, and how if you're benefiting from >100fps at 4K, DLSS will kick your gear into the 200fps region, and I'm sitting here on 1080p fucking with antialiasing and texture quality to try and make sure I can get a steady 40-60fps if I'm lucky.

Genuinely curious who affords these rigs or am I just so old and out of touch with today's youth that spending four grand on your gaming machine is totally normal now? Because even before I had a house, family, car etc. this still would have been a hilarious amount of money to drop.
Just wait until you find out that the DisplayPort 1.4 ports on the 4090 only support 4K at 120 FPS without compressing the image.
So this is really more for 1440p 144 Hz. Or maybe 1080p 240 Hz. 720p 500 Hz?
Pretty sure it can run almost all games at those resolutions and framerates natively, but hey, the card only starts at $1,599.
Nvidia has to save money somewhere, y'know. Switching to DP 2.0 (introduced in 2019, btw) would cost at least $0.50-$2 more per card.
Intel's $300-400 Arc meme GPUs also have DP 2.0.
Not to mention that monitors with DP 2.0 probably won't be released until next year or so, and by then you may as well upgrade to the 4090 Ti anyway.
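If you want to sanity-check that 120 FPS figure yourself, here's a rough back-of-the-envelope sketch in Python. It assumes 8-bit RGB and ignores blanking overhead, so real-world headroom is a bit tighter than these numbers suggest:

```python
# Rough DisplayPort bandwidth check (assumes 8-bit RGB, ignores blanking overhead).
# DP 1.4 HBR3: 32.4 Gbps raw, ~25.92 Gbps usable after 8b/10b encoding.
# DP 2.0 UHBR20 would give ~77.4 Gbps usable after 128b/132b encoding.
DP14_USABLE_GBPS = 25.92

def uncompressed_gbps(width, height, hz, bits_per_channel=8, channels=3):
    """Raw pixel data rate in Gbps, not counting blanking intervals."""
    return width * height * hz * bits_per_channel * channels / 1e9

for (w, h), hz in [((3840, 2160), 120), ((3840, 2160), 144), ((1920, 1080), 240)]:
    need = uncompressed_gbps(w, h, hz)
    verdict = "fits in DP 1.4" if need <= DP14_USABLE_GBPS else "needs DSC or DP 2.0"
    print(f"{w}x{h} @ {hz} Hz ~ {need:.1f} Gbps -> {verdict}")
```

4K 120 Hz comes out to roughly 23.9 Gbps, so it just squeaks under the limit; 4K 144 Hz doesn't, which is why anything above that needs DSC on these cards.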
 
1366x768 why won't you diieeee = 5.55%
1680x1050 is the only widescreen resolution worth a damn; it still fits programs sized for 1280x1024 just fine, but lets you play 360p webrips of TV shows in the proper aspect ratio.
 
I don't think it's actually that expensive to get into 4K anymore, especially after recent price drops. An RTX 3070 is probably good enough, and 4K TVs are appearing under $200. Whenever you see "ultra" settings being used in benchmarking, those are generally overkill and can be dropped for better FPS.

But let's look at Steam Survey:

1920x1080 = 66.38%
2560x1440 = 11.25%
1366x768 why won't you diieeee = 5.55%
3840x2160 = 2.46%
The 3070 is NOT a 4K card if you care about the settings. Also, $200 4K TVs look like shit.
 
No real opinion on this, but they are marketing very, very fancy smooth-motion TV technology to a crowd that hates that their parents think it looks fine when watching a movie at home.
Was funny seeing people say "nuh uh, it's not simple frame interpolation, it's fancy AI(tm) stuff only Nvidia can do".
 
I've had my eye on a 5800X3D for a bit because it sounds like something that could last me forever (okay, a long time), considering I prefer to spend a bit more now and not have to wait and wait for AM5 hardware to stop being insane. Am I being ridiculously over-the-top with this idea? I would mostly use it to play games, do some light video and audio editing on the side, and a bit of Blender.

The GPU is also haunting me because I want to follow a similar course and get something that maybe costs a bit more now but doesn't have to be bleeding edge and will last me for a while. I'm sitting on 1080p right now but I could consider upgrading my monitor if 1440p is really worth the shift up. Thinking of 6700XT or whatever the nVidia equivalent is and watching the local buy/sells. Seems like prices keep dropping and dropping and some scalpers might be sweating, which is nice to see.
 
It's insane they have the audacity to pull this sort of crap.
Truly, corporations have no shame and must be bullies 24/7.
If the 4090 really is what the 4080 should have been, that would make the upcoming 4080 a "4070" and the 4080 12GB a "4060". 80% of their product range is suddenly labeled as 80-series or better.
 
If the 4090 really is what the 4080 should have been, that would make the upcoming 4080 a "4070" and the 4080 12GB a "4060". 80% of their product range is suddenly labeled as 80-series or better.
I don't think the 4090 is the 4080; they literally changed the product stack, tremendously increasing the gap between the 80 and the 90, which must be something like GTX 1080 vs GTX 1060.
Crypto and AMD's inability to compete have been disastrous. It seems like another planet, buying a €300 GTX 970 and getting near-peak performance, and that was just 7 years or so ago.
 
Another reason why to this day I'm happy to say I've never given Nvidia a single cent.

Since my 9800gt, I've owned both brands pretty much 50/50. I've bought AMD cards brand new, but only used Nvidias.

They tried playing that shitty rebranded 4070 off as "oh, it's just a 4080 with less VRAM". They truly think of their customers as a bunch of brainless zoo animals.

Fuck Nvidia and anyone who stands up for this bullshit.

*Edit* @AgendaPoster What do you mean "AMD can't compete"? They have had very competitively priced models for a long time. Maybe not always flagship-challenging, but why should that matter for the 99% of PC builders who don't buy that shit anyways?
 
If the 4090 really is what the 4080 should have been, that would make the upcoming 4080 a "4070" and the 4080 12GB a "4060". 80% of their product range is suddenly labeled as 80-series or better.
The 4080 16 GB and 4080 12 GB were 2 different GPUs based on different dies, with the latter using the die number that would usually correspond to a 70-class GPU. The real 4080 has 26.7% more CUDA cores and 46% higher memory bandwidth than the canceled card. Source.
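Quick sanity check on where those percentages come from, for anyone curious. The numbers below are the launch-announcement figures (9728 vs 7680 CUDA cores, 256-bit/23 Gbps vs 192-bit/21 Gbps memory), so treat them as assumptions rather than final retail specs:

```python
# Core-count and memory-bandwidth gap between the two announced "4080" SKUs.
# Spec numbers are from the launch announcement and may differ from retail cards.
specs = {
    "RTX 4080 16 GB": {"cores": 9728, "bus_bits": 256, "mem_gbps": 23.0},
    "RTX 4080 12 GB": {"cores": 7680, "bus_bits": 192, "mem_gbps": 21.0},
}

def mem_bandwidth_gb_s(card):
    """Memory bandwidth in GB/s = bus width in bytes * per-pin data rate in Gbps."""
    return card["bus_bits"] / 8 * card["mem_gbps"]

big, small = specs["RTX 4080 16 GB"], specs["RTX 4080 12 GB"]
print(f"CUDA cores: +{big['cores'] / small['cores'] - 1:.1%}")                      # ~ +26.7%
print(f"Memory bandwidth: +{mem_bandwidth_gb_s(big) / mem_bandwidth_gb_s(small) - 1:.1%}")  # ~ +46.0%
```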

I have seen people arguing that The Card Formerly Known As 4080 12 GB was actually a 4060 Ti or 4060, but I think that's mostly for the lulz.

At least Nvidia now has a chance to repair its lineup before it commits to mistakes, like the knee-jerk inclusion of 12 GB in the RTX 3060.

BTW, the 4050 is confirmed.
 