GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Found an RX 580 8 GB for $120 on Newegg. Will that work?
That price is a ripoff. Get one used from eBay, Craigslist, whatever, if you want one. AMD is still providing up-to-date drivers for these cards, which is a little surprising, but I would not count on that lasting much longer since these are 7-year-old cards.

The Linux support through Mesa on these cards is really good. I had an RX 570 in my 2021 Manjaro build and it never missed a beat. Really great card; it ran contemporary autistic simulators like Transport Fever 2 at 1080p60. It struggled at 1440p, so I eventually went back to my GTX 1080 and sold that card.
 
So, it seems an M4 MacBook Pro has turned up in Russia, somehow making its way there from the Chinese production line.


According to the Russian YouTuber, it has the following:
Mac16,1
M4 10-core (4 performance + 6 efficiency) CPU @ 4.41 GHz
16 GB RAM*
3 Thunderbolt 4 ports* (up from 2 Thunderbolt 3)
Available in space black

It's not clear whether 16 GB is the base configuration or not. And the Thunderbolt 4 ports would be a decent step up from TB3.
 
I'm looking to replace a 1060 6GB with something low-to-mid range. I'm currently running a Ryzen 5700X after upgrading from an i5-6600K last year. The most demanding game I want to play is Space Marine 2, which currently runs at about 45-50 fps. My monitor is 1080p and 75 Hz, but I'll be looking into something with a higher refresh rate eventually. I don't give a shit about 2K or 4K.

I've avoided the 4060 mainly due to hearing memes about how it's a worse card than even the 3060, but I honestly don't know how true that is. I've been tempted by the AMD stuff due to wanting to switch full time to Linux at some point, but Nvidia has treated me well.
 
I'm looking to replace a 1060 6GB with something low-to-mid range. I'm currently running a Ryzen 5700X after upgrading from an i5-6600K last year. The most demanding game I want to play is Space Marine 2, which currently runs at about 45-50 fps. My monitor is 1080p and 75 Hz, but I'll be looking into something with a higher refresh rate eventually. I don't give a shit about 2K or 4K.

I've avoided the 4060 mainly due to hearing memes about how it's a worse card than even the 3060, but I honestly don't know how true that is.
I've been tempted by the AMD stuff due to wanting to switch full time to Linux at some point, but Nvidia has treated me well.
The XFX RX 6800 still looks like a really good deal (about $350 USD). I tried Nvidia on Linux recently but I couldn't get DLSS upscaling to work with Proton.
 
  • Like
Reactions: Skeletron Prime
I'm looking to replace a 1060 6GB with something low-to-mid range. I'm currently running a Ryzen 5700X after upgrading from an i5-6600K last year. The most demanding game I want to play is Space Marine 2, which currently runs at about 45-50 fps. My monitor is 1080p and 75 Hz, but I'll be looking into something with a higher refresh rate eventually. I don't give a shit about 2K or 4K.

I've avoided the 4060 mainly due to hearing memes about how it's a worse card than even the 3060, but I honestly don't know how true that is. I've been tempted by the AMD stuff due to wanting to switch full time to Linux at some point, but Nvidia has treated me well.
You can get a used 3060 for less than $300 these days on eBay, less if you go for the 8 GB model, but I'd recommend the 12 GB. This is assuming you're a first worlder.
 
I've avoided the 4060 mainly due to hearing memes about how it's a worse card than even the 3060, but I honestly don't know how true that is.

It's more powerful than the 3060 and outperforms it in games. Frame generation adds even more. At this point, the 3060 is already 3 years old.

I wouldn't get an AMD card unless FSR4 turns out to not be shit, although I've found FSR frame generation is actually pretty good.
 
I've avoided the 4060 mainly due to hearing memes about how it's a worse card than even the 3060, but I honestly don't know how true that is. I've been tempted by the AMD stuff due to wanting to switch full time to Linux at some point, but Nvidia has treated me well.
The 4060 was slightly worse than the 3060 Ti, which was the real sticking point for people: it didn't feel appropriately powerful for its 'spot' in the stack, and the popular sentiment was that it felt more like a -50 than a -60 tier card.

But also, this entire scene is full of retarded children without any real sense of history. A few slop games on ultra settings are bumping up against the 8 GB of VRAM the 4060 has, but there's no indication that it's a trend that's going to imminently doom your 8 GB cards. In my experience game requirements are just as likely to stagnate as they are to surge, especially since the PS5 is weaker than a 4060 and the 3060/4060 are currently Steam's top GPUs.

One thing to keep in mind for your situation is that even low-end cards these days take 8-pin instead of 6-pin PCIe power, so you should make sure you actually have an 8-pin connector available from your PSU. IIRC the 1060 is a 6-pin card, so even a 4060 isn't, strictly speaking, a drop-in replacement.
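If you want to gut-check that against your PSU, the arithmetic is simple. Rough sketch below, using the standard spec ratings (75 W from the x16 slot, 75 W per 6-pin, 150 W per 8-pin); the card examples in the comments are just illustrative:

```python
# Rated PCIe power budget: slot plus whatever aux connectors the card has.
# Spec ratings only (75 W slot, 75 W per 6-pin, 150 W per 8-pin); actual
# cards and PSU cables often have headroom beyond these numbers.
SLOT_W = 75
CONNECTOR_W = {"6-pin": 75, "8-pin": 150}

def rated_budget(connectors):
    """Total rated board power for a card with the given aux power plugs."""
    return SLOT_W + sum(CONNECTOR_W[c] for c in connectors)

print(rated_budget(["6-pin"]))  # 150 W ceiling, e.g. a GTX 1060-style card
print(rated_budget(["8-pin"]))  # 225 W ceiling, e.g. a typical single-8-pin card
```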
 
One thing to keep in mind for your situation is that even low-end cards these days take 8-pin instead of 6-pin PCIe power, so you should make sure you actually have an 8-pin connector available from your PSU. IIRC the 1060 is a 6-pin card, so even a 4060 isn't, strictly speaking, a drop-in replacement.
Does it really matter? These connectors are overbuilt to the point where you can run more than double the rated power and not have any issues.
 
The 4060 was slightly worse than the 3060 Ti, which was the real sticking point for people: it didn't feel appropriately powerful for its 'spot' in the stack, and the popular sentiment was that it felt more like a -50 than a -60 tier card.

But also, this entire scene is full of retarded children without any real sense of history. A few slop games on ultra settings are bumping up against the 8 GB of VRAM the 4060 has, but there's no indication that it's a trend that's going to imminently doom your 8 GB cards. In my experience game requirements are just as likely to stagnate as they are to surge, especially since the PS5 is weaker than a 4060 and the 3060/4060 are currently Steam's top GPUs.

One thing to keep in mind for your situation is that even low-end cards these days take 8-pin instead of 6-pin PCIe power, so you should make sure you actually have an 8-pin connector available from your PSU. IIRC the 1060 is a 6-pin card, so even a 4060 isn't, strictly speaking, a drop-in replacement.

The other thing is that basically all the reviewers disable inferencing-based features to do their comparisons... which are the marquee features of both the 30 and 40 series. The 30 series can't do frame generation at all. So if you've got a game that runs at 60 fps on both a 4060 and a 3060 Ti, you can flip on FG with the 4060 and get a substantially smoother experience.
 
  • Informative
Reactions: Gog & Magog
Since we're talking about 3060s, I am building another workstation and thinking about what GPU I want to put in it. My main franken-workstation has a 12 GB 3060, which has worked well since I upgraded from an experiment of running two 1050 Tis. On both systems I'm driving 2-4 2K displays, and I need to do multi-display remote desktop and screen sharing, sometimes simultaneously (so lots of simultaneous real-time H.264 encode/decode). I don't really play video games at all, and if I do it's shit like OpenTTD, so I don't need crazy gaming performance. I've got a spare 3090, but it's power hungry and stupidly large. I've also got an RX 6400, but I didn't read the fine print and it turns out it has no fucking H.264 encoding support. Is the 3060 the best bang for the buck for good encode/decode performance and display driving, or is there something like a Quadro variant that doesn't cost an arm and a leg that might also work (or should I look at AMD or Intel cards)? I have no problem with used cards.
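One thing that might save you another RX 6400-style surprise: before committing to a card, you can at least check which hardware H.264 encoders your ffmpeg build exposes. A quick sketch (assumes ffmpeg is on your PATH; an encoder being compiled in doesn't guarantee the installed GPU actually has the encode block, so treat it as a first pass only):

```python
# List the hardware H.264 encoders this ffmpeg build knows about.
# Assumes ffmpeg is installed and on PATH. Note: an encoder being compiled
# in doesn't mean the GPU has the encode hardware (see: RX 6400).
import subprocess

HW_H264_ENCODERS = ["h264_nvenc", "h264_amf", "h264_qsv", "h264_vaapi"]

def available_hw_encoders():
    out = subprocess.run(
        ["ffmpeg", "-hide_banner", "-encoders"],
        capture_output=True, text=True, check=True,
    ).stdout
    return [enc for enc in HW_H264_ENCODERS if enc in out]

if __name__ == "__main__":
    print(available_hw_encoders())
```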

Also, what the fuck happened to 75W GPUs? I kind of miss having decent low-power devices like the 1050 Ti and 1650.
 
The other thing is that basically all the reviewers disable inferencing-based features to do their comparisons... which are the marquee features of both the 30 and 40 series. The 30 series can't do frame generation at all. So if you've got a game that runs at 60 fps on both a 4060 and a 3060 Ti, you can flip on FG with the 4060 and get a substantially smoother experience.

Personally I find generating fake frames to be fake and gay. If you can't generate an actual 60 fps, then enabling FG will make the game feel way worse; and if you can generate an actual 60 fps, there's no point in enabling it.
 
Personally I find generating fake frames to be fake and gay.
Everything every GPU has ever rendered is an optical illusion full of sleight of hand to both make the illusion more convincing and perform better - I don't understand why people have collectively decided that frame generation is the one technique that's "illegitimate".
 
Everything every GPU has ever rendered is an optical illusion full of sleight of hand to both make the illusion more convincing and perform better - I don't understand why people have collectively decided that frame generation is the one technique that's "illegitimate".
Latency issues, probably. Even on a terrible screen (59.whatever Hz) I can tell the difference when flicking my mouse around in first person (assuming V-Sync is off and screen tearing is on). And that's comparing 50-60 Hz vs 100+; if frame generation only gets you up to 60 it'll be worse than that.
 
If you can't generate an actual 60 fps, then enabling FG will make the game feel way worse; and if you can generate an actual 60 fps, there's no point in enabling it.

You've obviously never used it. If you can generate an actual 60 fps, FG will boost that to 90+ fps. If you can generate an actual 90 fps, FG will boost that to 120+ fps. I use AMD's version in Helldivers II to run at about 120 fps, up from the ~75 fps the game does without it. It definitely does not feel "way worse." It feels "way better." Best thing is, AMD's tech works on any Radeon from the last couple of gens, not just the newest cards.
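The reason it's 90+ rather than a clean doubling is just that the FG pass eats a bit of render time before the interpolation kicks in. A toy model of that arithmetic (the overhead number is a made-up illustration, not a measured figure):

```python
# Toy model: 2x frame interpolation doubles presented frames, but the FG
# pass itself costs some frame time first, so the base rate drops a little.
# The overhead value is an illustrative guess, not a measurement.

def fg_output_fps(base_fps, fg_overhead_ms=2.5):
    """Estimate presented fps with 2x frame generation enabled."""
    frame_time_ms = 1000.0 / base_fps
    effective_base = 1000.0 / (frame_time_ms + fg_overhead_ms)
    return 2 * effective_base

print(round(fg_output_fps(60)))  # ~104 presented fps from a 60 fps base
print(round(fg_output_fps(75)))  # ~126 presented fps from a ~75 fps base
```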

Everything every GPU has ever rendered is an optical illusion full of sleight of hand to both make the illusion more convincing and perform better - I don't understand why people have collectively decided that frame generation is the one technique that's "illegitimate".

FG isn't philosophically much different than motion blur, just another technology for making video look smoother. I remember people whose video cards couldn't do motion blur seething about how stupid it was; now it's in tons of games.
 
  • Like
Reactions: Brain Problems
if frame generation only gets you up to 60 it'll be worse than that.
Frame gen is entirely dependent on where you're starting FPS-wise. If you enable frame gen at 30 fps to bump up to around 50 fps, your input latency will still feel like 30 fps, which will be shit. FG is essentially a technology that helps gamers push high-refresh-rate monitors while still using higher settings. As long as you're comfortable with whatever your input latency is pre-FG, you'll have nothing to complain about.
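The numbers make the latency point obvious: FG changes how many frames you see, not how often the game samples your input, so responsiveness still tracks the base rate. A simplified illustration (the displayed-fps figures are just examples; this ignores render queues and driver-side latency reduction):

```python
# Simplified: frame generation raises displayed fps, but input is only
# sampled once per *real* frame, so responsiveness tracks the base rate.
# Displayed-fps numbers are illustrative; ignores render queues, Reflex, etc.

def input_interval_ms(base_fps):
    """Time between input samples, set by the real (pre-FG) framerate."""
    return 1000.0 / base_fps

for base, displayed in [(30, 50), (60, 100), (90, 150)]:
    print(f"displayed ~{displayed} fps, input sampled every "
          f"~{input_interval_ms(base):.1f} ms (feels like {base} fps)")
```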
 

Also, what the fuck happened to 75W GPUs? I kind of miss having decent low-power devices like the 1050 Ti and 1650.
The latest ones are:
Nvidia RTX 3050 6GB
Intel Arc A380 (6 GB)
AMD Radeon RX 6400 (4 GB, 53W, turd)

AMD pushed the RX 6500 XT to 107W to make up for the bad die (PCIe 4.0 x4, which downgrades to PCIe 3.0 x4 in commonly used older systems; 4 GB VRAM, with a couple of 8 GB models; and nerfed video decode/encode, because Navi 24 was intended for laptops).

Intel's low-power cards are well regarded for handling video, but the drivers have been crap for gaming, although they have improved over time. The RTX 3050 6GB has a bad reputation for cutting a large chunk of cores and performance compared to the RTX 3050 8GB, but it's mostly just down to a bad price, around $170-180 right now. If it were significantly cheaper it could be legendary.

Pro cards:
NVIDIA RTX A1000 (8 GB)
NVIDIA RTX A2000 (6/12 GB variants)
NVIDIA RTX 2000 Ada Generation (16 GB)
NVIDIA RTX 4000 SFF Ada Generation (20 GB)
Intel Arc Pro A40 (6 GB, 50W)
Intel Arc Pro A50 (6 GB)
AMD Radeon PRO W6400 (4 GB, 50W, pro version of RX 6400)
AMD Radeon Pro W7500 (8 GB)

Nvidia is going hard on ~75W workstation cards, but with pretty high prices. 16-20 GB is useful for AI.

What's next? Intel could release Battlemage GPUs, including another crack at the low end. Nvidia's Blackwell is coming, but probably not another 75W card unless it's for workstations. AMD's Radeon Pro W7500 is exactly what an RX 7500 based on Navi 33 should look like, which would be much better than the trashy RX 6400 and RX 6500 XT based on Navi 24. There have been leaks related to it, but who knows when/if it will come out.

 
You've obviously never used it. If you can generate an actual 60 fps, FG will boost that to 90+ fps. If you can generate an actual 90 fps, FG will boost that to 120+ fps. I use AMD's version in Helldivers II to run at about 120 fps, up from the ~75 fps the game does without it. It definitely does not feel "way worse." It feels "way better." Best thing is, AMD's tech works on any Radeon from the last couple of gens, not just the newest cards.

Re-read what I said. I said that below an actual 60 fps, it will feel worse. I did not say that enabling it at a higher real framerate was going to make the game feel worse. I only said it was pointless to enable it at higher framerates.

FG isn't philosophically much different than motion blur

Well, you just gave another good reason not to bother with FG.
 