SchizoDaemon
kiwifarms.net
Joined Apr 17, 2026
> Laughs in TDI.

I have a TDI. A 300TDI.
> I have a TDI. A 300TDI.

“A four that moves like an eight.”
> He used an IIGS so he's ancient from my point of view.

A lot of people had hand-me-down 8/16-bit computers in the early 90s. People were using 16-bit stuff all the way up to the mid-90s.
> He used an IIGS so he's ancient from my point of view.

i mean... i had a windows 3.1 pc with a 486 or some shit when i was growing up, but i don't say "it was my first PC" when my mom put a couple games on the computer and i played them. Considering this nigga is talking about ICQ and 56k modems, it's pretty obvious he was not using an Apple IIGS or Amigas when they were contemporary.
> I have a TDI. A 300TDI.

Finally at the point in life to get an Evo VIII and of course they quit making them.
> I went from a DOS commodore to Windows 95 pentium. Before we upgraded I remember being envious of my friends and their 486s that could use VGA (oooohhhh ahhhh) whereas I was stuck in the slow lane with EGA. Sad!

I only got a PC when I was 14, didn't have a computer of any sort before—friends and neighbors had computers or consoles that I would play on occasionally—but the timing was perfect, as it was the era of the Internet, multimedia, and 3-D gaming by then.
Well who was laughing when I got to play Myst and the 7th Guest?!
> Finally at the point in life to get an Evo VIII and of course they quit making them.

Often like that, unfortunately. Second-hand vehicle prices for anything decent are going through the roof as well.
> Would you guys consider replacing a 5950x with a 5800x3d? I'm not really seeing much in the way of gains on userbenchmark, but I know little about the site and its reliability. Sorry if this is a stupid question.

I just wanted to say real quick that you should avoid using userbenchmark for anything serious. It's definitely a lolcow of hardware sites, and it's notorious for trying to make AMD stuff look bad and defending Intel when it screws up. It often chalks AMD's popularity up to "paid influencers and AMD's marketing team". That being said, one of the recent-ish Intel releases (might have been Raptor Lake) was bad enough that even they said they couldn't defend it.
> I just wanted to say real quick that you should avoid using userbenchmark for anything serious. ...

i'd just like to sperg out about this. the benchmarking software itself is actually useful for comparing the cpu/memory performance of specific builds against each other, so it's good software if you're using it to benchmark your overclocks. The problem is their database is corrupted with bad info, so it's completely worthless for anything but comparing two specific computer builds. And they aren't even benchmarking GPUs, so anything regarding a graphics card is bullshit.
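For what it's worth, "comparing two specific builds" really just means running the same benchmark on both and comparing medians over a few runs, so run-to-run noise doesn't fool you. A minimal sketch with made-up scores (the function name and all numbers are illustrative, not anything UserBenchmark actually exposes):

```python
from statistics import median

def compare_runs(build_a_scores, build_b_scores):
    """Median score of repeated runs, plus B's gain over A in percent."""
    a, b = median(build_a_scores), median(build_b_scores)
    return a, b, (b - a) / a * 100

# hypothetical memory-benchmark scores from two overclock profiles
stock = [101.2, 99.8, 100.4, 100.9, 100.1]
tuned = [106.0, 105.1, 105.7, 106.4, 105.3]

a, b, gain = compare_runs(stock, tuned)
print(f"stock={a} tuned={b} gain={gain:.1f}%")  # gain ≈ 5.3%
```

Medians rather than single runs matter here: one background-process hiccup can easily swing an individual run by more than the overclock gain you're trying to measure.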
> I just wanted to say real quick that you should avoid using userbenchmark for anything serious. ...

In their own words:

> Since UserBenchmark declines sponsorship, it has become the target of a smear campaign which intensified following improvements to the CPU effective speed index in July 2019. Billion-dollar brands can try to shut us down but they can’t change who we are, the clue is in our name. UserBenchmark serves users exclusively without corporate sponsorship or "free" samples.

It's literally just an aggregate of user-run benchmarks.
For context, it once recommended readers purchase a Core i5-13600K over the Ryzen 7 9800X3D, asserting, and I quote, "Spending more on a gaming CPU is often pointless."
> It's literally just an aggregate of user-run benchmarks.

It actually isn't. There are people who work for the site writing editorial blurbs for hardware when you look at specific parts, and a lot of these write-ups heavily favor Intel even in situations where it doesn't make sense. An example is their 3300X review, where they criticize it by saying cores don't add to performance, but then tell people to buy an i3-10100 instead because it has an integrated GPU, as if gamers need that over cores.
> Where's the lie? TH's own benchmarks show that expensive CPUs deliver nothing of value for 99% of gamers.

Depends on the CPU and the game. Some games are more CPU bound, and the X3D CPUs can really shine in those instances. IIRC, Cyberpunk might be one of those games, but I can't remember for sure. The real crime there is recommending anyone buy Intel 13th gen when they're well known to suffer from horrible manufacturing issues that cause the CPUs to die over time.
> It's literally just an aggregate of user-run benchmarks.

It's all presented to you as if the information is being calculated from the massive database you're given access to, but once you start clicking through the charts it's pretty obvious they're fudging the numbers.
> Depends on the CPU and the game. Some games are more CPU bound, and the X3D CPUs can really shine in those instances. IIRC, Cyberpunk might be one of those games, ...

It can get you to 144 fps... if you run at 1080p on a 4090 with raytracing off. Who buys a 4090 to run games at 1080p? This is precisely the sort of benchmark that had me saying, "Oh, I guess CPU doesn't matter much." I'm not buying a $1500 GPU to run games at 1080p settings just so I can watch my CPU run as hot as possible. I crank shit up until I'm in the neighborhood of 70-90 fps.
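The back-and-forth above boils down to a simple bottleneck model: frame rate is capped by whichever of the CPU or GPU takes longer per frame, which is why CPU reviews test at 1080p (light GPU load) and why those gains vanish at 4K. A rough sketch with made-up frame times (all numbers hypothetical):

```python
def effective_fps(cpu_frame_ms, gpu_frame_ms):
    # the slower of the two stages caps the frame rate
    return 1000.0 / max(cpu_frame_ms, gpu_frame_ms)

# assumed per-frame costs: two CPUs, and a light (1080p) vs heavy (4K) GPU load
fast_cpu, slow_cpu = 4.0, 7.0     # ms of CPU work per frame
gpu_1080p, gpu_4k = 3.0, 16.0     # ms of GPU work per frame

print(effective_fps(fast_cpu, gpu_1080p))  # 250.0  — CPU-bound, CPU choice matters
print(effective_fps(slow_cpu, gpu_1080p))  # ~142.9 — slower CPU costs ~100 fps
print(effective_fps(fast_cpu, gpu_4k))     # 62.5   — GPU-bound
print(effective_fps(slow_cpu, gpu_4k))     # 62.5   — identical: CPU upgrade buys nothing
```

This is also why the esports argument holds: at very high target frame rates the per-frame budget shrinks to a few milliseconds, and the CPU stage is much more likely to be the one that blows it.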
> but then tell people to buy an i3-10100 instead because it has an integrated GPU, as if gamers need that over cores

> In addition, unlike the 3300X, the 10100 also packs integrated graphics with Quick Sync hardware encoding, which alleviates the need for desktop users to purchase a separate GPU.

> I just wanted to say real quick that you should avoid using userbenchmark for anything serious. ...

Even right now it's claiming the 9070 XT is "23%" worse than the 5070 Ti, despite them trading blows, all because one of their retarded tests gets a huge performance boost on the Nvidia card. In the real world they trade blows in the majority of actual use cases.
> Where's the lie? TH's own benchmarks show that expensive CPUs deliver nothing of value for 99% of gamers.

Of the top ten most popular games on Steam right now, six are esports titles that heavily rely on the CPU, and their players expect higher frame rates than normal. Just because you are not playing them doesn't mean they don't exist.
> 100% accurate statement.

Yeah, I could have picked a better example than that. Here's the site downplaying the very real power draw issues the Core i9-11900K had; I think most reviewers at the time mentioned how much power it would draw along with how hot it would get. That serves what I was trying to get at a little better.

> That said, there's a reason I don't read articles by these slapfighting retards who think Fortune 500 companies are their friends. They do the exact same thing TH does, except gargling a different company's nutz:

I agree. At the end of the day, I just want to point out that sites like this aren't very reliable in case someone asks about it. I hope I've sort of shown that.
> I agree. At the end of the day, I just want to point out that sites like this aren't very reliable in case someone asks about it. I hope I've sort of shown that.

I don't see any reason to distrust their benchmarks, though; I doubt they have time to hand-sort every benchmark and cherry-pick from thousands of runs to add a -3% anti-AMD bias.
> Even right now it's claiming the 9070 XT is "23%" worse than the 5070 Ti, despite them trading blows, ...

The 9070 XT has 25% less bandwidth than the 5070 Ti, and a lot of GPU workloads are bandwidth bound, so that tracks. In a compute-bound, non-inferencing load they're pretty even. Bandwidth bound, inferencing, or raytracing, Nvidia spanks AMD.
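The bandwidth point can be sanity-checked with the usual peak-bandwidth formula: bus width in bytes times per-pin data rate. The bus widths and data rates below are illustrative stand-ins (roughly GDDR6-class vs GDDR7-class on equal 256-bit buses), not exact SKU specs, so the exact percentage gap will differ from the 25% figure depending on what clocks you assume:

```python
def mem_bandwidth_gbs(bus_width_bits, data_rate_gbps):
    # peak bandwidth (GB/s) = bus width in bytes × per-pin data rate (Gbps)
    return bus_width_bits / 8 * data_rate_gbps

# assumed numbers: same 256-bit bus, different memory generations
slower = mem_bandwidth_gbs(256, 20)   # 640.0 GB/s
faster = mem_bandwidth_gbs(256, 28)   # 896.0 GB/s
print(slower, faster, f"{(faster - slower) / faster:.0%} deficit")
```

The takeaway matches the post: with identical bus widths, the per-pin data rate alone opens a sizeable bandwidth gap, and in bandwidth-bound workloads that gap shows up almost one-for-one in the score.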
> i mean... i had a windows 3.1 pc with a 486 or some shit when i was growing up, but i don't say "it was my first PC" when my mom put a couple games on the computer and i played them. ...

56k is just the earliest point most Americans got online, because the release of the first mass-market 56k modems in 1997 almost coincided with AOL's shift to a flat $20/mo fee (as opposed to pay-per-minute), which caused the population of internet users to 10x almost overnight. Even most boomers would have first gotten online right around then, just because it was quite expensive prior to 1996/1997.