GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Sounds like @DavidS877 is running a server. Weak CPU + 100 TB of storage isn't a gaming machine.
A server? Like just one?
And to be fair, 100 TB (usable) is what I expect to need when I can afford to move to all SSD, which may be a while based on how slowly prices are dropping.
 
A server? Like just one?
And to be fair, 100 TB (usable) is what I expect to need when I can afford to move to all SSD, which may be a while based on how slowly prices are dropping.
Got a customer with multiple 1 PB servers.
 
Windows is a better PC operating system than Linux if you aren't a raging autist that wants to pretend your 1970s-mainframe-esque OS is somehow built for real home computing.
Windows is also a 1970s-mainframe-esque OS by your definition, considering that it has its roots in VMS.

Got a customer with multiple 1 PB servers.
Extremely based.
 
  • Like
Reactions: seri0us
I hate Frank Azor sm bros
He's not wrong though, and he hasn't said anything that @The Ugly One hasn't said. Really makes you think...

There are plenty of games that can cope with 8 GB at 1080p/1440p, Starfield being a recent notable example, and the esports titles he mentions. I've seen cheap 1080p 165-180 Hz displays at Walmart. There are 1080p 400-600 Hz displays made for the esports losers, that I guess the 9060 XT could reach in some older games, or with frame gen, or with the updated "Redstone" frame gen coming later in the year.

If you hate that, you go for the 9060 XT 16 GB. That card will be scalped to $400 immediately, but could fall back sooner than the 9070 series. The die size is 199 mm², 56% of the size of Navi 48, and a little smaller than Navi 33 (7600 XT).

The good news is that this should be the final generation for mainstream 8 GB cards. AMD and Intel (if they make desktop Celestial) will move to GDDR7, and 3 GB GDDR7 modules should be easy to get. So everybody can put 12 GB on 128-bit, with 8 GB reserved for only the lowest of the low.
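The 12-GB-on-128-bit claim is just module arithmetic. A quick sketch of it (using only the module widths and densities mentioned above, nothing vendor-specific):

```python
# Each GDDR module sits on a 32-bit slice of the memory bus, so
# capacity = (bus width / 32) * per-module density.
def vram_gb(bus_width_bits: int, module_gb: int) -> int:
    modules = bus_width_bits // 32
    return modules * module_gb

print(vram_gb(128, 2))  # 8  -> today's 8 GB cards (2 GB GDDR6)
print(vram_gb(128, 3))  # 12 -> same 128-bit bus with 3 GB GDDR7
print(vram_gb(256, 2))  # 16 -> a 256-bit board with 2 GB modules
```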
 
  • Agree
Reactions: Combustion Engine
  • Informative
Reactions: Max Weber
Let me introduce you to a fun article and website:
Blur Busters: Blur Busters Law: The Amazing Journey To Future 1000Hz Displays (archive)
https://blurbusters.com/

I wouldn't go out and spend $1000+ on a 500+ Hz display. But there's something to it. If we all end up with these refresh rates, 4x frame generation could be a "cheap" way to fill in the frames.
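Back-of-envelope for that: with Nx frame generation the displayed rate is just the rendered rate times N, so a fairly modest rendered rate saturates one of these panels (a sketch of the multiplier, not any particular vendor's implementation):

```python
# With Nx frame generation, displayed rate = rendered rate * N;
# only 1 of every N displayed frames reflects fresh input.
def displayed_fps(rendered_fps: float, gen_factor: int) -> float:
    return rendered_fps * gen_factor

print(displayed_fps(250, 4))  # 1000.0 -> fills a 1000 Hz panel
print(displayed_fps(120, 4))  # 480.0
```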
frame gen effectively creates input lag so this is just a very autistic way to spend a lot of money on snake oil

Bro it's missing basic shit like a TPM. Your upgrade path is zero. Such is life.
your bloodstream is missing basic shit like the covid vaccine. get yourself up to date, youre an embarrassment.
 
frame gen effectively creates input lag so this is just a very autistic way to spend a lot of money on snake oil
My understanding is that it's less of a problem if the initial frame rate is already high (such as hypothetically, going from 250 to 1000). Which is part of why frame gen isn't very useful for people who don't already have a 4090/5090.
 
My understanding is that it's less of a problem if the initial frame rate is already high (such as hypothetically, going from 250 to 1000). Which is part of why frame gen isn't very useful for people who don't already have a 4090/5090.
that makes it not very useful period, theres no way any sane person would agree its worth spending thousands to feed your 240+ hz monitor (going above that is already just dick measuring) with fake frames. you cant even use the "muh esports need rapid hand eye coordination" excuse people usually bring up. its like audiophile retards telling you you have to spend $15k on a tube amp to add that pleasant creamy warmth and mellow roundness to the 30+ khz superharmonics or whatever shit that even your dog cant hear
 
Actual mental illness. Surely this is well into the diminishing returns territory.
top level competitive players of games like counter strike can actually feel the difference, these monitors are mostly for those people
 
frame gen effectively creates input lag so this is just a very autistic way to spend a lot of money on snake oil

The additional time it takes to compute and draw an extrapolated frame is about 10 ms, or 1/20 of the time it takes a signal to travel from your retina to your brain and down to your fingers. Since you already experience the world with about 200 ms of latency, depending on how old you are (I'm on the high side *sigh*), your brain's evolved to be very good at syncing up information and fooling you into thinking you're engaging with the world in real time. An extra 10 ms is no big deal for it.
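Taking the two figures at face value (both the 10 ms penalty and the ~200 ms loop are the post's estimates, not measurements), the proportion works out like so:

```python
# Toy latency budget using the estimates from the post above.
frame_gen_penalty_ms = 10    # estimated cost of an extrapolated frame
human_loop_ms = 200          # estimated retina-to-fingers loop time

fraction = frame_gen_penalty_ms / human_loop_ms
print(f"{fraction:.0%}")     # 5% -> the "1/20" in the post
```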

that makes it not very useful period, theres no way any sane person would agree its worth spending thousands to feed your 240+ hz monitor (going above that is already just dick measuring) with fake frames.

I like it, it makes things look smoother and better, especially in shooters, where the difference between 70 fps and 140 fps is noticeable. Since AMD's Fluid Motion Frames works on almost any DX12 game now, and I have a 162 Hz monitor, I keep games locked at 80 fps and turn on FMF. And since I'm not a mongoose, it doesn't affect my gaming at all other than making it more pleasant.

top level competitive players of games like counter strike can actually feel the difference, these monitors are mostly for those people

Even for retards like me, there's a visible difference in turn smoothness at very high frame rates. Not that it stops me from playing 1990s games that are locked at 25 fps.

There are plenty of games that can cope with 8 GB at 1080p/1440p, Starfield being a recent notable example, and the esports titles he mentions.

The PS5 has 16 GB of RAM total, and will, based on past trends, be the lead platform for nearly all games until about 2030. Moreover, PC game devs these days typically shoot to run at 60 fps on low settings on 7-8-year-old cards, and 12-16 GB wasn't commonly available until 2023's 40 series. So realistically, if you have an 8 GB card, it will be about 5 years before games start coming out that just plain run like shit no matter what you do.
 
I like it, it makes things look smoother and better, especially in shooters, where the difference between 70 fps and 140 fps is noticeable. Since AMD's Fluid Motion Frames works on almost any DX12 game now, and I have a 162 Hz monitor, I keep games locked at 80 fps and turn on FMF. And since I'm not a mongoose, it doesn't affect my gaming at all other than making it more pleasant.
sure, these are reasonable numbers. anyone can tell the difference between 60 hz and 160 hz. which is why i specifically mentioned going above 250 hz which is where you run headfirst into diminishing returns and thats even with real frames, let alone framegen.
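The diminishing returns fall straight out of the frame-time arithmetic: frame time is 1000/Hz ms, so each step up in refresh rate buys less absolute smoothness than the last (the numbers below are just that conversion):

```python
# Frame time shrinks hyperbolically with refresh rate, so equal
# ratios of Hz buy ever-smaller absolute frame-time reductions.
def frame_time_ms(hz: float) -> float:
    return 1000.0 / hz

for lo, hi in [(60, 160), (250, 500)]:
    saved = frame_time_ms(lo) - frame_time_ms(hi)
    print(f"{lo} -> {hi} Hz saves {saved:.2f} ms per frame")
# 60 -> 160 Hz saves 10.42 ms per frame
# 250 -> 500 Hz saves 2.00 ms per frame
```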
 
Look, they need a 4K copy of every episode of Pokemon in every language its ever been dubbed in for research purposes.
You joke, but storage systems for media streaming services are colossal.
sure, these are reasonable numbers. anyone can tell the difference between 60 hz and 160 hz. which is why i specifically mentioned going above 250 hz which is where you run headfirst into diminishing returns and thats even with real frames, let alone framegen.
This is just my retarded theory, so take it with a grain of salt. In anything, pros operate at a completely different level than the rest of you. NFL long snappers, for example, can snap the ball precisely so that it lands in the punter's hands laces up or laces down, depending on what the punter wants, and lace facing affects the punt. Pro soccer players get so good at taking into account the aerodynamics of the ball that the league changes the stitching pattern every few years to keep them on their toes. On top of that, the best players are genetic freaks, able to regrow soft tissue at inhuman rates, accumulate muscle mass faster than 99.9% of the population, etc.

Add in that some of them sit almost with their noses on their monitors.

Then consider that what graphics are really doing is tricking your brain into thinking it's seeing something real, not tiny squares lighting up on a panel. Since reality doesn't have a frame rate, and neither do your retinas, my theory here is that at very very high frame rates, your brain drowns more deeply in the illusion that what's on the screen is real and is spending less effort compensating for the skips between frames, and this gives the top-end gamers, who likely have freakish genetics when it comes to reflexes and perception, and have devoted their entire lives to developing a skill that gets them no bitches instead of baseball, a tiny, tiny edge.

Of course, for the rest of us, it's just a placebo.
 
Stress testing my potato PC as I've been running into BSoDs and games just shutting down at random

stressing the RAM does nothing
stressing the CPU does nothing
stressing both results in an immediate error and PC crash

Screenshot 2025-05-25 141959.webp


is there a specific test I could throw at it to pinpoint the cause?
 
  • Thunk-Provoking
Reactions: Betonhaus
Stress testing my potato PC as I've been running into BSoDs and games just shutting down at random

stressing the RAM does nothing
stressing the CPU does nothing
stressing both results in an immediate error and PC crash

View attachment 7408966

is there a specific test I could throw at it to pinpoint the cause?
y-cruncher with FFTv4 is my favorite for finding memory instability in short order.
If it fails, then drop the RAM frequency to JEDEC and see if it still fails.

Just because memory passes memtest doesn't mean that it will be stable when the memory controller gets hammered under heavy load.
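To illustrate the "passes memtest but still unstable" point: pattern testers write fixed bit patterns and read them back, which a lightly loaded system can pass even with marginal timings. A toy walking-ones sketch of the pattern logic (Python lists, not physical RAM, so this only shows the idea, not a real memory test):

```python
# Walking-ones sketch: write a single set bit in every position,
# read it back, count mismatches. Real testers do this against raw
# physical memory, ideally while the memory controller is loaded.
def walking_ones_check(words: int) -> int:
    errors = 0
    for bit in range(32):
        pattern = 1 << bit
        buf = [pattern] * words                        # "write"
        errors += sum(1 for w in buf if w != pattern)  # "read back"
    return errors

print(walking_ones_check(1 << 12))  # 0 -> no flipped bits detected
```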
 
Then consider that what graphics are really doing is tricking your brain into thinking it's seeing something real, not tiny squares lighting up on a panel. Since reality doesn't have a frame rate, and neither do your retinas, my theory here is that at very very high frame rates, your brain drowns more deeply in the illusion that what's on the screen is real and is spending less effort compensating for the skips between frames, and this gives the top-end gamers, who likely have freakish genetics when it comes to reflexes and perception, and have devoted their entire lives to developing a skill that gets them no bitches instead of baseball, a tiny, tiny edge.

Of course, for the rest of us, it's just a placebo.
thing is, i dunno if youve ever been into esports, but i really cant think of a single game where shit like this would matter even at the peak professional level. pro players who can reach the absolute top just on pure mechanical skill are extremely rare exceptions. most pros are successful because they have a solid foundation of raw skill combined with excellent strategic and tactical knowledge of the game. i think this applies to every esport whether its a moba, an fps, an rts or anything.

most of my own competitive gaming experience is with fps shooters and a huge part of what makes it so you can kill the other guy before he can kill you is game sense and prediction, i.e. knowing when and where hes gonna appear, not millisecond-instant reactions to something that has already happened. this whole idea that getting a 10 khz polling rate mouse or a 600 hz monitor gives you an actual advantage is just bullshit i think. at this point youre fighting against the inherent latency that even a local ethernet connection has.

obviously if youre a successful pro who wins esport tournaments for a living and makes millions, sure go buy a 2000 hz monitor for $10k, its not like its gonna make you worse at the game. but for anyone "normal" who has a limited amount of money in their bank account this technology is absolutely not worth the cost and probably never will be unless some brand new future display technology either makes the concept of refresh rates obsolete or makes it trivial to have 500 hz displays (and also to achieve 500 fps without having to generate 70% of those frames)
 
Stress testing my potato PC as I've been running into BSoDs and games just shutting down at random

stressing the RAM does nothing
stressing the CPU does nothing
stressing both results in an immediate error and PC crash

View attachment 7408966

is there a specific test I could throw at it to pinpoint the cause?
You could try downclocking the RAM and re-running the test.

When people were finding issues with their 13th gen Intel CPUs, one way to bring them back was to downclock the RAM.
 
  • Like
Reactions: Lady Adjani