GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

  • 🐕 I am attempting to get the site running as fast as possible. If you are experiencing slow page load times, please report it.
I'm not in the US, so I wouldn't know. Considering this guy's technical knowledge, though, I think buying used is going to be a challenge.
How much of a challenge buying used is mostly depends on the component. Anything with pins is usually a no-go, and so is all storage.

I'm not in the US either, I just assume people on here are unless they say otherwise.
 
  • Like
Reactions: harbringer883
How would I get more ram exactly? Sorry I am a tech retard when it comes to this. I have a Nitro Pacer Desktop PC.

You can just buy it on Amazon or Newegg. Desktop memory is standardized: each type (DDR4, DDR5, etc.) has a standard-size slot it fits into. You buy it, pop out the old memory, being careful to unfasten any clips holding it in, then pop in the new memory. With your machine, this will work:

 
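If you want to double-check what's already installed before ordering (slot, capacity, speed, DDR type), here's a rough sketch, assuming Windows, using the built-in wmic tool (deprecated on newer builds, but still present on most machines):

```python
# Rough sketch: list the populated memory slots on a Windows machine.
# Assumes wmic is available (deprecated, but still ships with Windows 10/11).
import subprocess

out = subprocess.run(
    ["wmic", "memorychip", "get", "DeviceLocator,Capacity,Speed,SMBIOSMemoryType"],
    capture_output=True, text=True, check=True,
)
# One row per populated slot. Capacity is in bytes;
# SMBIOSMemoryType 26 = DDR4, 34 = DDR5.
print(out.stdout)
```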
It's not 800, it's 1200 once five years of online is factored in.
Wait, what jewery is this? Do you mean they have to buy the hardware and then pay to use it?
It's because he's a class-A flaggot who DMCAs videos that are critical of him.
Care to link some of that? Or at least give a basic gestalt of the situation. I'm hungry for some technical drama.
 
Wait, what jewery is this? Do you mean they have to buy the hardware and then pay to use it?

Ten bucks a month to play online multiplayer with Xbox Game Pass Core; it's unnecessary for single player. They sweeten the pot with a few "free" games each month. Right now, it includes Gears 6, Doom 2016, Deep Rock Galactic, and Arkham Knight, among others. If you actually play a few of the games, it's worth it. Back when I was still on console, I easily got my money's worth out of the PS+ titles. But if all you want to do is play one game online for a year and GRIND DAT BATTLEPASS*, it adds up quick**.

*sold separately
**battlepass does not include additional premium bundles

Care to link some of that? Or at least give a basic gestalt of the situation. I'm hungry for some technical drama.

It seems to me, having watched a little more of his material, that his biggest blind spot is not understanding the cost of a worker's time. He's a young guy trying to start his own company, and this is a frequent blind spot of young entrepreneurs in all industries. They will laugh at how stupid you are for not just doing this one simple thing to get $1000 of revenue, while that "one simple thing" gobbles up an entire work day of their own time. As a rule of thumb, I value an hour of an engineer's time at about $200. So in my book, you just spent $1600 to make $1000.

Anyway, this guy is a pretty strong advocate of graphics techniques that require lots of hand-tweaking to get good results. In one of his videos, I don't remember which, he was praising a technique that performs great and looks great...just so long as you ensure none of your meshes have large triangles that cross the screen. Well, if I have two technologies in front of me, one where the engineer can drop in his models, run an auto-cleanup tool, and be done in 2 hours, and one where he spends a week chasing squirrels, I'm going with the first technology, because it costs me $400 to implement and the latter costs me $8,000. I don't care at all that the second technology looks somewhat better, or that an autistic man with a magnifying glass can show exactly why. I have a product to ship and a budget to meet, and a technology that costs me 20x more to use is a non-starter.
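Spelled out, with my $200/hr rule of thumb (2 hours for the auto-cleanup route, a 40-hour week for the hand-tweaking one):

```python
# Back-of-envelope: engineer hours times a $200/hr rule-of-thumb rate.
HOURLY_RATE = 200                 # assumed loaded cost of an engineer-hour
auto_hours, hand_hours = 2, 40    # drop-in + auto-cleanup vs. a week of squirrel-chasing

print(f"auto-cleanup:  ${auto_hours * HOURLY_RATE:,}")    # $400
print(f"hand-tweaking: ${hand_hours * HOURLY_RATE:,}")    # $8,000
print(f"ratio: {hand_hours // auto_hours}x")              # 20x
```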

At the end of the day, while he's very knowledgeable and has a lot of valid criticism of Unreal Engine 5, he seems to be making a mistake similar to the one John Carmack made near the end of his career: not understanding that your #1 concern needs to be getting your product done on time and within budget.
 
Last edited:
Wait, what jewery is this? Do you mean they have to buy the hardware and then pay to use it?

Care to link some of that? Or at least give a basic gestalt of the situation. I'm hungry for some technical drama.
Sure.

His biggest problem is that he starts off with some valid technical ideas and then spins off into grand conspiracy theories about how UE5 and modern development are aligned to Keep Da Gamers Down. And then if you try to push back on him at all, he DMCAs you.

The reality is more like what @The Ugly One says - hand-tuning every single one of your assets in a game that might have millions of individual assets is simply not economical. There's a lot to be said about things like ray tracing and Lumen and modern AA and how computationally expensive they are for very little gain, but we can't "go back" to the halcyon days when a studio could sit down and micro-optimize their DOS games. Even if the devs wanted to spend 10 years getting a game out with hand-baked lighting or perfectly optimized models, the studios and publishers can't do that - most modern games struggle to make their money back as it is, and you can't just tack on an additional 5 years of dev time when, if we're being honest, most people are going to upgrade their hardware anyway even if they grouse about it.
 
I feel like the PS3 and PS4 eras were both artificially and ridiculously long because of the Great Recession as well as COVID. So now that we're finally getting a graphics leap, it's causing the young who don't know any different to start screeching about muh optimization because hardware requirements are finally going up.
 
  • Informative
Reactions: Prehistoric Jazz
So now that we're finally getting a graphics leap, it's causing the young who don't know any different to start screeching about muh optimization because hardware requirements are finally going up.
This is part of it.

Another part of it is that a much larger segment of the PC gaming population now comes from outside the first world. If you're an American or a western European or a first-world East Asian, the prices on GPUs are already bad enough. But if you're in Mexico or Indonesia or Russia or Brazil, even a modest xx60-class GPU is now priced at 50-75% of your monthly income.

These are people who have been effectively priced out of modern gaming entirely after decades of being able to keep up with the first-world Joneses, and they're upset about it. And I can't really blame them for being upset that the industry, in both software and hardware, has left them behind by basically saying, "lol just stop being poor."

The situation is something akin to when a decent F2P game goes entirely to shit just so it can milk even more money from whales.
 
Anyway, this guy is a pretty strong advocate of graphics techniques that require lots of hand-tweaking to get good results. In one of his videos, I don't remember which, he was praising a technique that performs great and looks great...just so long as you ensure none of your meshes have large triangles that cross the screen. Well, if I have two technologies in front of me, one where the engineer can drop in his models, run an auto-cleanup tool, and be done in 2 hours, and one where he spends a week chasing squirrels, I'm going with the first technology, because it costs me $400 to implement and the latter costs me $8,000. I don't care at all that the second technology looks somewhat better, or that an autistic man with a magnifying glass can show exactly why. I have a product to ship and a budget to meet, and a technology that costs me 20x more to use is a non-starter.
The reality is more like what @The Ugly One says - hand-tuning every single one of your assets in a game that might have millions of individual assets is simply not economical. There's a lot to be said about things like ray tracing and Lumen and modern AA and how computationally expensive they are for very little gain, but we can't "go back" to the halcyon days when a studio could sit down and micro-optimize their DOS games. Even if the devs wanted to spend 10 years getting a game out with hand-baked lighting or perfectly optimized models, the studios and publishers can't do that - most modern games struggle to make their money back as it is, and you can't just tack on an additional 5 years of dev time when, if we're being honest, most people are going to upgrade their hardware anyway even if they grouse about it.
I was more looking for specific episodes of online tard fights, but good posts tho.

This reminds me of ray-tracing workflows and the GI shortcuts to save on render time. Back then, you tuned parameters across several text boxes to tell the ray tracer how you wanted shit to be rendered. As throwing more compute at the problem became more and more viable, just using brute-force GI also became more and more popular.

Some lore related to this: back in the early 2010s, Arnold wasn't available to the public. All the information about it was just artists at FX houses praising it for speed and ease of use. Eventually they released a public demo that plugged into Maya. What people found was that Arnold actually rendered slower than, say, Mental Ray or V-Ray, but it was much easier to set up. It also didn't choke on large scenes.

It was, however, much simpler to use. Fewer number boxes than V-Ray, and with enough CPU power, it could still render faster in practice. The thing they said about it back then was "CPU time is cheap, artist time is expensive."
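To put the "brute force" idea in code, here's a toy Monte Carlo sketch (everything in it is invented for illustration, not from any real renderer): no knobs to tune, you just buy smoothness with samples, and the noise falls off roughly as 1/sqrt(N).

```python
# Toy "brute force GI": estimate sky visibility at a point by firing random
# hemisphere rays. The occlusion test is faked with a known blocked fraction
# (30%) so the sketch stays self-contained; a real renderer would trace rays.
import random

BLOCKED = 0.30  # ground truth: 30% of the hemisphere is occluded

def visibility_estimate(n_samples: int) -> float:
    hits = sum(random.random() < BLOCKED for _ in range(n_samples))
    return 1.0 - hits / n_samples

# More samples, less variance between runs - no parameter tuning involved.
for n in (16, 256, 4096):
    runs = [visibility_estimate(n) for _ in range(5)]
    print(f"{n:5d} samples: estimates range {min(runs):.3f}-{max(runs):.3f}")
```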
 
  • Informative
Reactions: The Ugly One
I was more looking for specific episodes of online tard fights, but good posts tho.
He's on the Unreal Engine forums under the name TheKJ. I'm at work right now, so I don't have a ton of time to dig for gold, but here's some silver that fell out while I was searching through his post history:
[attached screenshot: 1747322579785.webp]
 
I was more looking for specific episodes of online tard fights, but good posts tho.

This reminds me of ray-tracing workflows and the GI shortcuts to save on render time. Back then, you tuned parameters across several text boxes to tell the ray tracer how you wanted shit to be rendered. As throwing more compute at the problem became more and more viable, just using brute-force GI also became more and more popular.

Some lore related to this: back in the early 2010s, Arnold wasn't available to the public. All the information about it was just artists at FX houses praising it for speed and ease of use. Eventually they released a public demo that plugged into Maya. What people found was that Arnold actually rendered slower than, say, Mental Ray or V-Ray, but it was much easier to set up. It also didn't choke on large scenes.

It was, however, much simpler to use. Fewer number boxes than V-Ray, and with enough CPU power, it could still render faster in practice. The thing they said about it back then was "CPU time is cheap, artist time is expensive."

The problem for future game dev is that it's cheaper for the dev to say "put light sources here" and then let the GPU do the work. Manually setting the lighting is a lot more work.
 
The reality is more like what @The Ugly One says - handtuning every single one of your assets in a game where you might have millions of individual assets is simply not economical. There's a lot to be said about things like ray-tracing and lumen and modern AA and how computationally expensive they are for very little gain but we can't 'go back' to the halcyon days where a studio could sit down and micro-optimize their DOS games. Even if the devs wanted to do that and spend 10 years to get a game out with hand-baked lighting or perfectly optimized models, the studios and publishers can't do that - most modern games struggle to make their money back anyway and you can't just tack on an additional 5 years of dev time when, if we're being honest, most people are going to update their hardware anyway even if they grouse about it.

I see it as something that's been happening for almost 30 years now, going all the way back to Descent using vertex lighting from in-game sources rather than hand-tweaked sectors. There were things you could do with Doom's lighting that you couldn't with Descent's, like make dramatic shadows. But with Descent, you just put light sources where you wanted, and there you go.
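For anyone who hasn't seen it, the Descent-style approach boils down to something like this (a minimal sketch, not actual Descent code; every name and number here is made up):

```python
# Per-vertex lighting computed from placed point lights, instead of
# hand-authored per-sector brightness values.
import math

def vertex_light(vertex, normal, lights):
    """Sum simple Lambert shading with inverse-square falloff per light."""
    total = 0.0
    for pos, intensity in lights:
        to_light = [p - v for p, v in zip(pos, vertex)]
        dist = math.sqrt(sum(c * c for c in to_light))
        direction = [c / dist for c in to_light]
        lambert = max(0.0, sum(n * d for n, d in zip(normal, direction)))
        total += intensity * lambert / (dist * dist)
    return min(total, 1.0)

# The artist just places a light overhead; every vertex picks up a
# plausible brightness automatically. Nobody hand-paints sectors.
lights = [((0.0, 5.0, 0.0), 20.0)]
print(vertex_light((0.0, 0.0, 0.0), (0.0, 1.0, 0.0), lights))  # 0.8
```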

With the latest tech, raytracing, it's all unified: rather than patching together and tweaking several different reflection technologies, different approaches to static and dynamic shadows, fiddling with ambient occlusion hacks in each area to make them all work, etc., you get one consistent solution. Now, I'm sure this kid could put together a video proving that you could produce something that looks almost exactly like the path-traced version of this scene using the Source engine and a variety of cleverly placed light probes, effects maps, prebaked shadow maps, and so on, and still achieve 100 fps. But the thing is, it didn't take CD Projekt Red months and months of hand-tweaking assets and tuning localized effects one by one across every region of the game to achieve this look. It was far less work than that.

[attached screenshot: 1747323740629.webp]

I feel like the PS3 and PS4 eras were both artificially and ridiculously long because of the Great Recession as well as COVID. So now that we're finally getting a graphics leap, it's causing the young who don't know any different to start screeching about muh optimization because hardware requirements are finally going up.

I think it's because we've seen basically no new game types enabled by the last generation of hardware. The PS4 could do some things at scale the PS3 couldn't, like Battlefield with 64 players that didn't run like absolute shit. But examples were few and far between. From PS4 to PS5, I can't think of even one. If technology had frozen forever in 2012, the only thing that would really be different is how pretty games are, not what games are actually being made.
 
I think it's because we've seen basically no new game types enabled by the last generation of hardware. The PS4 could do some things at scale the PS3 couldn't, like Battlefield with 64 players that didn't run like absolute shit. But examples were few and far between. From PS4 to PS5, I can't think of even one. If technology had frozen forever in 2012, the only thing that would really be different is how pretty games are, not what games are actually being made.
I'm happy to see stuff like Clair Obscur and other AA games start displacing AAA. MBAs have ruined the AAA studios.
 
it adds credibility to Threat Interactive in the sense that he knows what he is talking about.
I've seen a lot of people say he doesn't know what he's talking about, but I've never seen anybody make a substantial case. It's always something tangential.

Forgot to mention that I've also seen people say he went to the graphics programming Discord server and admitted he doesn't know any graphics programming, but I haven't seen any screenshots myself.
 
Last edited by a moderator: