GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

even my 1060 could do ray tracing. would i recommend it? no, because either my cpu was holding it back or the thing was too weak to actually do it. 1080ti will probably do it better tho
No. They'll both be shit at it. GTX cards have no dedicated hardware for ray tracing calculations. Idk, maybe at some shit settings at 1080p and 30 fps or something... but it's just not a good idea.
 
I finally managed to get my i9-12900 to go over 50% utilization by throwing some statistical moment calculations at it. Quadratures via numpy, then some nested sums and gamma functions. But I probably could bring that down again by doing more evaluate-and-store.

Basically, modern CPUs are really neato.
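
For the curious, this sketch is roughly what that kind of workload looks like. It's not the actual script: the gamma distribution and moment orders are made up purely for illustration, and scipy is doing the quadrature rather than raw numpy, but the shape of it is the same: a pile of independent integrals you can throw at every core.

[CODE=python]
# Minimal sketch (not the actual script) of a quadrature-based moment
# calculation: raw moments of a gamma distribution, done numerically and
# checked against the closed form. Distribution, parameters and moment
# orders are made up purely for illustration; scipy handles the quadrature.
import math
from concurrent.futures import ProcessPoolExecutor

import numpy as np
from scipy.integrate import quad

SHAPE, SCALE = 2.5, 1.3  # illustrative gamma(k, theta) parameters


def gamma_pdf(x: float) -> float:
    """Gamma(k, theta) density, normalized via lgamma for numerical stability."""
    if x <= 0.0:
        return 0.0
    k, theta = SHAPE, SCALE
    log_pdf = (k - 1) * math.log(x) - x / theta - k * math.log(theta) - math.lgamma(k)
    return math.exp(log_pdf)


def raw_moment(order: int) -> float:
    """E[X^order] by numerical quadrature over (0, inf)."""
    value, _err = quad(lambda x: x**order * gamma_pdf(x), 0.0, np.inf)
    return value


if __name__ == "__main__":
    orders = range(1, 9)
    # Each integral is independent, so farm them out across all the cores.
    with ProcessPoolExecutor() as pool:
        numeric = list(pool.map(raw_moment, orders))
    # Closed form as a sanity check: E[X^n] = theta^n * Gamma(k + n) / Gamma(k)
    exact = [SCALE**n * math.exp(math.lgamma(SHAPE + n) - math.lgamma(SHAPE))
             for n in orders]
    for n, got, want in zip(orders, numeric, exact):
        print(f"moment {n}: quad={got:.6g} exact={want:.6g}")
[/CODE]

More evaluate-and-store would presumably just mean caching intermediate values like the density instead of recomputing them inside every integrand, which is why it brings the utilization back down.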


$140. It was running Darktide fine, and Sons of the Forest. Supposedly this thing supports Ray Tracing as well.

It performs on par with, if not better than, newer cards like the 3060 and 5700 XT that retail for over $300, so yes, you got a very good deal.
 
I bought an EVGA 1080 Ti SC2 Hybrid with 11GB of VRAM because prices for 20xx and 30xx cards are still stupid. Did I fuck up?
The 10-series has aged remarkably well. Nvidia essentially had to invent hype around raytracing to sell the 20- and 30-series, since nobody with a 10-series in their right mind would upgrade. The only reason the 30-series sold as well as it did, apart from the crypto tulip mania, was DLSS promising you would be able to trace rays without completely nuking your frame rate, which was sort of true, but limiting it to the 20-series and newer was also blatantly a marketing trick. AMD's FSR works just fine on the 10-series, and I've even used it with the integrated Intel graphics on my ancient MacBook, so there's no reason to think DLSS wouldn't also work on older hardware.

My husband's computer uses a 1080 non-Ti, and it plays games in 4K without problems. It can't push past 60fps, but that's still playable. If you can stand playing at a lower resolution, it won't be a problem at all. As long as you're not planning on using Linux, it'll be fine; the 10-series unfortunately isn't getting the next-generation Nvidia Linux driver, so it'll probably never work well with Wayland.
 
It performs on par with, if not better than, newer cards like the 3060 and 5700 XT that retail for over $300, so yes, you got a very good deal.
I did some comparisons on Versus, and apparently all that most of the newer cards really do is beat this one on features, not raw performance.
 
I did some comparisons on Versus, and apparently all that most of the newer cards really do is beat this one on features, not raw performance.

There are certain tasks newer cards outperform it on. But it doesn't matter, because if your game won't run on 10-series cards, it's not going to run on consoles, either. So enjoy going bankrupt.
 
And finally there's the brainwashing about what counts as Good in video gaming. IMHO people have forgotten what a good game is.
I'm reminded of those AAA devs, salty at Elden Ring because it didn't follow the current AAA rules for a good game.

I finally managed to get my i9-12900 to go over 50% utilization by throwing some statistical moment calculations at it. Quadratures via numpy, then some nested sums and gamma functions. But I probably could bring that down again by doing more evaluate-and-store.

Basically, modern CPUs are really neato.
Does it have a glowing skull on it though?
 
I bought an EVGA 1080 Ti SC2 Hybrid with 11GB of VRAM because prices for 20xx and 30xx cards are still stupid. Did I fuck up?
Nope. I still run an EVGA 1070 in my backup computer. You just have to understand exactly what you want to do with your rig/card.
 

AMD FidelityFX presentation at GDC on March 23.

Look out for FSR 3, which is likely to be AMD's equivalent of DLSS 3 fake... frames. They could also reveal RDNA 3 specific hardware acceleration of FSR.

Don't get me started on how poorly a stunning number of games are optimized for multicore CPUs. Why the fuck this isn't the game dev equivalent of learning "C-A-T spells CAT" by now is beyond me.
Maybe I'm just retarded, but single-thread logic is what makes sense to human beings, and not everything can be parallelized. Sometimes most new bits of game logic depend on everything before them having happened sequentially. Not in an open world game, obviously, and sometimes an engine like UE5 can optimize for multi-core for you.

If the situation doesn't improve from the big 2 consoles having 8-core (16-thread optional) Zen 2 after 7 years of having 8-core Jaguar, then it's doomed.
 
Maybe I'm just retarded, but single-thread logic is what makes sense to human beings, and not everything can be parallelized. Sometimes most new bits of game logic depend on everything before them having happened sequentially. Not in an open world game, obviously, and sometimes an engine like UE5 can optimize for multi-core for you.

If the situation doesn't improve from the big 2 consoles having 8-core (16-thread optional) Zen 2 after 7 years of having 8-core Jaguar, then it's doomed.

Yes, single-threaded logic is much more intuitive to human beings than multi-threaded logic. But you can say the same about bubble sort, linear search, global variables, and linked lists. It is The Current Year. Multicore CPUs have been mainstream for fifteen fucking years now. If you're still unable to write thread-safe data structures and think in parallel, you're a retarded monkeynigger who needs to go back to being a journalist.
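
Since "thread-safe data structure" sounds scarier than it is, here's a minimal sketch of the most boring possible example; nothing engine-specific, just a counter guarded by a lock and hammered by a handful of threads:

[CODE=python]
# Minimal illustrative sketch, not from any real engine: the lock is the whole
# trick. An unguarded += is a read-modify-write and can silently drop updates
# when several threads interleave.
import threading


class SafeCounter:
    def __init__(self) -> None:
        self._value = 0
        self._lock = threading.Lock()

    def increment(self, amount: int = 1) -> None:
        with self._lock:  # serialize the read-modify-write
            self._value += amount

    @property
    def value(self) -> int:
        with self._lock:
            return self._value


def hammer(counter: SafeCounter, n: int) -> None:
    for _ in range(n):
        counter.increment()


if __name__ == "__main__":
    counter = SafeCounter()
    threads = [threading.Thread(target=hammer, args=(counter, 25_000)) for _ in range(8)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    print(counter.value)  # 200000 every run; without the lock it can come up short
[/CODE]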
 
Zen4 X3D chip reviews are coming out. Looks like the general consensus is that it matches the 13900K in gaming while not burning meme-tier levels of power in doing so. The simulated 7800X3D only used an average of 44W to match the 13900K...

7950X3D vs 13900K: [benchmark chart]
I generally consider TPU the most accurate and trustworthy source for hardware reviews, but their 13900K figures are somewhat... weird.
You can see here a 6GHz OC chip pulling about 100W in Cyberpunk 2077 with a 4090. That's half of the 200W in TPU's measurements.
There are other clips from the game where I've seen one pulling 150W, with temps spiking to 100C instead of the 50C seen in the linked video.
I have a feeling some motherboards severely overvolt the chip on default settings, resulting in this obscene power draw and temps.
This being said, even at 100W the 13900K is not doing that well... at all. It is cheaper than a DDR5 7950X3D combo though, right? Both need to be cheaper :P
 
Yes, single-threaded logic is much more intuitive to human beings than multi-threaded logic. But you can say the same about bubble sort, linear search, global variables, and linked lists. It is The Current Year. Multicore CPUs have been mainstream for fifteen fucking years now. If you're still unable to write thread-safe data structures and think in parallel, you're a retarded monkeynigger who needs to go back to being a journalist.
I've been working on a dual-core microcontroller. The retards are all "Use the mutex and the locks and the xyz." Or, you know, I could just use the fact that a byte is still a byte and is still atomic, as long as I force the compiler and the CPU to actually write to and read from the memory location. Then again, I come from a time when efficiency mattered and you wrote microcontroller code in assembly because the compiler would make it too big for the flash.

Now, if you'll excuse me I need to adjust the onion on my belt.
 

AMD FidelityFX presentation at GDC on March 23.

Look out for FSR 3, which is likely to be AMD's equivalent of DLSS 3 fake... frames. They could also reveal RDNA 3 specific hardware acceleration of FSR.


Maybe I'm just retarded, but single-thread logic is what makes sense to human beings, and not everything can be parallelized. Sometimes most new bits of game logic depend on everything before them having happened sequentially. Not in an open world game, obviously, and sometimes an engine like UE5 can optimize for multi-core for you.

If the situation doesn't improve from the big 2 consoles having 8-core (16-thread optional) Zen 2 after 7 years of having 8-core Jaguar, then it's doomed.
If FSR3 is announced and compatible with my 6750XT, it would make me very, very happy.
 
If we're talking US pricing, then this is wrong. A 6700XT is almost always cheaper. The 6700XT as a whole outperforms the 3060ti @ 1440p, which is where you should be buying this type of card. Unless you're talking about memetracing.

DLSS doesn't mean jack shit anymore as FSR 2 is quite honestly just about as good. CUDA...whatever. That's for people to pretend to do stuff with at such a low price point. If anyone was doing any real work with CUDA, they should really be stepping it up.

8gb of vram is fast becoming a limitation at 1440p, as Hogwarts has shown us. Nvidia customers should really stop overpaying so much for relatively low amounts of vram.

@Freedom Fighter Are you a US shopper? Have a MC or somebody you know who does? If so, that $600 7900X combo is crazy and blows any other deal on the market out of the water right now.
Oldish post, but I ended up getting a 5600X and motherboard combo for $200. Finished rebuilding last night, and either I fucked up the cooler install or Ryzen runs a lot hotter than my old 8600K.

At idle, with auto settings, it fluctuates from 50 to 65C, where the fan is pretty loud. Under load it gets hotter and louder, obviously.
Thinking about slapping a Noctua NH-D9L on there.
 
Oldish post, but I ended up getting a 5600X and motherboard combo for $200. Finished rebuilding last night, and either I fucked up the cooler install or Ryzen runs a lot hotter than my old 8600K.

At idle, with auto settings, it fluctuates from 50 to 65C, where the fan is pretty loud. Under load it gets hotter and louder, obviously.
Thinking about slapping a Noctua NH-D9L on there.
If you are running the boxed cooler then I would 100% recommend getting something quieter on there. My 2700X came with the Wraith Prism and even that thing was noisy.
The heat spikes at idle/during low load should also get better with a bigger cooler. My 2700X also has those, and the bigger thermal mass of even "just" an NH-U14S cut around 10C off them.

The boxed coolers are just way too small for any kind of silent or even just quiet operation. They saturate with heat very quickly and need to ramp up fan speed almost instantly if you so much as open a web browser.
Also, keep your bill or proof of purchase if you get a Noctua. They send out free mounting kits for newer sockets if you have proof of purchase of a new system/motherboard and the cooler.
They are definitely on the pricey side of things, but the future support and the good mounting system made it worth it in my case.
 
I personally think Noctua gear is overpriced. I've heard very, very good things about the Peerless Assassin 120 SE, which is only $35 on Amazon.


Competes quite well against the much more expensive D15. If you can fit it, they have the 7-pipe 140mm Frost Commander. The bigger and more overkill you go, the quieter you can run.
 
I personally think Noctua gear is overpriced. I've heard very, very good things about the Peerless Assassin 120 SE, which is only $35 on Amazon.


Competes quite well against the much more expensive D15
I bet it doesn’t have sexy beige and brown fans though. Cooling is very much a case of you get what you pay for.
 