GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

All my parts finally arrived today. Is it normal to want to commit suicide during a PC build? Everything is going wrong :mad:
If you never had to build a PC back when you had to deal with Socket 7 coolers and cases trying to cut off your fingers at every turn, you haven't seen shit.

EDIT: Yeah, as smaug said, go into the BIOS and enable XMP; modern CPUs love their RAM running at good speeds and you can lose a lot of performance from not running your RAM at its max.
While true, the first Ryzen chips were also extremely picky about RAM; XMP was mostly an Intel thing that usually worked right out of the box there. Even brand-name sticks didn't always work.
It got better with later Ryzen and AGESA updates, but in the beginning it was a pain in the ass.
 
Finally DPD relented and I got my CPU, got the new PC all set up which is great now I just need to port over all my old shit onto this nice new SSD.
Get to test out my i5-12400F + RTX 3060 (got it secondhand for £300 on eBay, only used for four months!) on some bideo games.
 
Finally DPD relented and I got my CPU, got the new PC all set up which is great now I just need to port over all my old shit onto this nice new SSD.
Get to test out my i5-12400F + RTX 3060 (got it secondhand for £300 on eBay, only used for four months!) on some bideo games.
Don't get so excited, you'll end up playing Heroes of Might and Magic 3 like the rest of us.
 
Well, I got the RX 580. I knew going in that it's not exactly the most efficient GPU there is, but the data really isn't great. Measured at the wall, it's about 20-25W higher power consumption doing normal desktop things vs. the 4650G Pro iGPU. The fan spins up and turns off at random times, which is really annoying, but I guess that's just how the GPU's BIOS temperature curves are set. During video playback the power consumption shoots up to about 75W, vs. ~29W with the iGPU, and video playback without annoying fan noise isn't possible.

What irks me most is that normal desktop operation is so inefficient. I tried tweaking the power states a bit but it doesn't really make a whole lot of difference. I admit the case isn't ideal for this graphics card, and a bigger, better-ventilated case could probably pull off zero-fan operation in desktop mode (I tried forcing the GPU fans off, but that just makes the card overheat very slowly). The power consumption would still be much worse. I got really used to my entirely silent iGPU system that sat in a nice corner temperature-wise, and even in the games it ran well it merely sipped energy (playing e.g. Stellaris was a 35W deal, also completely quiet). This just feels very inefficient and crude in comparison. It runs the games I wanted to play very well, but everything else is just more noise, heat, and cost with no real upside. Looking around for similar experiences, this high power consumption with the card doing nothing might actually be a Linux problem; it seems the kernel people care even less about superfluous power consumption and heat than AMD does. I could live with the added power consumption, but random fan noises doing normal desktop stuff and light gaming/emulation after the paradise that was the iGPU? No deal.

One way to deal with this could be to just pci-stub the card in Linux and only activate it with a VM or similar passthrough for the specific use cases I have, and use the iGPU for everything else. That just feels like a lot of added complexity though. My guess is I'm going to sell it and just wait until iGPUs get where I need them. I feel that won't take long anyway.
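For reference, the pci-stub/passthrough idea usually comes down to binding the card to the vfio-pci driver at boot so nothing on the host touches it. A rough sketch, assuming the card's vendor:device IDs (look yours up with `lspci -nn`; the IDs below are what an RX 580 and its HDMI audio function typically report):

```shell
# Find the GPU's vendor:device IDs (an RX 580 usually shows 1002:67df)
lspci -nn | grep -iE 'vga|audio'

# Tell vfio-pci to claim those IDs before amdgpu can,
# e.g. in /etc/modprobe.d/vfio.conf:
#   options vfio-pci ids=1002:67df,1002:aaf0

# Regenerate the initramfs and reboot; the card then sits idle on the
# host and can be handed to a VM for passthrough.
```

This is only a sketch of the general approach; exact steps (initramfs tooling, IOMMU kernel parameters) vary by distro.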
 
Well, I got the RX 580. I knew going in that it's not exactly the most efficient GPU there is, but the data really isn't great. Measured at the wall, it's about 20-25W higher power consumption doing normal desktop things vs. the 4650G Pro iGPU. The fan spins up and turns off at random times, which is really annoying, but I guess that's just how the GPU's BIOS temperature curves are set. During video playback the power consumption shoots up to about 75W, vs. ~29W with the iGPU, and video playback without annoying fan noise isn't possible.

What irks me most is that normal desktop operation is so inefficient. I tried tweaking the power states a bit but it doesn't really make a whole lot of difference. I admit the case isn't ideal for this graphics card, and a bigger, better-ventilated case could probably pull off zero-fan operation in desktop mode (I tried forcing the GPU fans off, but that just makes the card overheat very slowly). The power consumption would still be much worse. I got really used to my entirely silent iGPU system that sat in a nice corner temperature-wise, and even in the games it ran well it merely sipped energy (playing e.g. Stellaris was a 35W deal, also completely quiet). This just feels very inefficient and crude in comparison. It runs the games I wanted to play very well, but everything else is just more noise, heat, and cost with no real upside. Looking around for similar experiences, this high power consumption with the card doing nothing might actually be a Linux problem; it seems the kernel people care even less about superfluous power consumption and heat than AMD does. I could live with the added power consumption, but random fan noises doing normal desktop stuff and light gaming/emulation after the paradise that was the iGPU? No deal.

One way to deal with this could be to just pci-stub the card in Linux and only activate it with a VM or similar passthrough for the specific use cases I have, and use the iGPU for everything else. That just feels like a lot of added complexity though. My guess is I'm going to sell it and just wait until iGPUs get where I need them. I feel that won't take long anyway.
Just underclock it and maybe undervolt it. The RX 580 runs hot because it's an overclocked 480, which ran pretty hot itself. Make a profile for light use and one for heavy use (that resets the underclock) and switch between them as necessary.
 
Just underclock it and maybe undervolt it. The RX 580 runs hot because it's an overclocked 480, which ran pretty hot itself. Make a profile for light use and one for heavy use (that resets the underclock) and switch between them as necessary.
That was a good hint. I looked at the power states of the card and noticed that while the GPU has 8 power states (and the memory 3), as soon as any graphics work has to be done it of course shoots to the highest state without even considering anything in between (a Linux classic). Locking it to the lowest state of 300 MHz core/300 MHz RAM is plenty for all of desktop use, video playback, and also light gaming. I then deactivated automatic fan control and locked the fans to a very low, not-really-audible RPM. These two things make the card sit at a rock-solid 42°C no matter what (light) work it is doing.

Now comes the experimenting with undervolting/underclocking the states for gaming. My guess is that it just picks the highest state until it runs into either power (you can define a maximum power draw) or thermal limits. Unhelpfully, Linux doesn't let me set a lower voltage or clock than the lowest state. I don't know if it's an artificial limitation in the kernel or if the firmware/BIOS doesn't allow it. I guess if I really want to undervolt/underclock that lowest state (and that's where the money is, so to speak) I'm going to have to edit the BIOS. Efficiency-wise the card still kinda sucks, but the sucking has become less offensive; that's a start.
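For anyone wanting to try the same state-locking, it goes through the amdgpu sysfs interface. A rough sketch, run as root; the `card0` index may differ on your system:

```shell
# Switch amdgpu power management from automatic to manual control
echo manual > /sys/class/drm/card0/device/power_dpm_force_performance_level

# List the available core clock states; '*' marks the active one
cat /sys/class/drm/card0/device/pp_dpm_sclk

# Pin core and memory clocks to the lowest state (index 0)
echo 0 > /sys/class/drm/card0/device/pp_dpm_sclk
echo 0 > /sys/class/drm/card0/device/pp_dpm_mclk
```

These settings don't survive a reboot, so you'd put them in a startup script or udev rule if you want them permanent.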
 
Doubleposting, but I found a better solution for Linux: DRI PRIME. Basically the amdgpu driver is loaded for both cards, with the significantly less hungry iGPU as the primary card. The RX 580 in this setup is pretty much off; the driver automatically sends it into D3hot when not needed, where it basically consumes zero power and produces no heat. I can't really measure any power consumption (~25W at the wall doing desktop stuff, as is normal with my 4650G setup, monitor included, vs. 40-55W with the dGPU as primary card, plus a lot more heat). When I need the dGPU I just launch the program/game in question with the environment variable DRI_PRIME=1 and that's it: the program gets rendered on the dGPU and displayed via the iGPU. There is probably some performance overhead and added latency, but I didn't notice it.

This feature is mostly intended for hybrid-graphics laptops but works just fine on desktops too; there's nothing hardware-specific about it. The biggest difficulty was the appallingly poor documentation and tard-wrangling my BIOS into not turning the internal GPU off. Now I have the best of both worlds: light gaming, browsing etc. on the iGPU, the heavy stuff on the dGPU, and switching is just a variable away.
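If anyone wants to check which GPU a program actually lands on, something like this works (assuming Mesa and the `glxinfo` utility are installed):

```shell
# Default: renders on the primary card (the iGPU in this setup)
glxinfo | grep "OpenGL renderer"

# With the offload variable set, Mesa renders on the secondary card (the dGPU)
DRI_PRIME=1 glxinfo | grep "OpenGL renderer"

# Games launch the same way, e.g. as a Steam launch option:
#   DRI_PRIME=1 %command%
```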
 
Just underclock it and maybe undervolt it. The RX 580 runs hot because it's an overclocked 480, which ran pretty hot itself. Make a profile for light use and one for heavy use (that resets the underclock) and switch between them as necessary.
Undervolting/underclocking are your friends.

My total rig with an RX 5700, underclocked/undervolted, uses under 300 watts when playing video games, video editing, etc.

I built my rig for performance and efficiency, and if I do things right I can get that RX 6800, undervolt it, and not stress out my PSU at ~50% load.

I'm sure as hell not going to buy a video card that turns into a space heater because of its high wattage.
 
Sorry if this is a dumb question: my new motherboard has an 8-pin slot marked CPU_PWR1. Unfortunately my 780W PSU doesn't have an 8-pin connector, but it has multiple 4-pin connectors. Will putting two 4-pin connectors side-by-side into the 8-pin slot blow it up?
 
Sorry if this is a dumb question: my new motherboard has an 8-pin slot marked CPU_PWR1. Unfortunately my 780W PSU doesn't have an 8-pin connector, but it has multiple 4-pin connectors. Will putting two 4-pin connectors side-by-side into the 8-pin slot blow it up?
Check the manual for the mobo. IIRC mine was the same, and the instructions said to plug one 4-pin plug into the leftmost position on the mobo; nothing blew up.
 
Sorry if this is a dumb question: my new motherboard has an 8-pin slot marked CPU_PWR1. Unfortunately my 780W PSU doesn't have an 8-pin connector, but it has multiple 4-pin connectors. Will putting two 4-pin connectors side-by-side into the 8-pin slot blow it up?
Shouldn't be a problem; on mine it's the same way, there are two 2x2 connectors that slot together to become a 2x4 that goes into the CPU_PWR socket. Anyway, both the sockets and plugs are probably keyed (look closely at them) to avoid accidental electro-sodomy.
 
Many recent PSUs, 2016 and up, do carry two sets of variable connectors, a 6+2 and a 4+4 pin. They were made with possible future motherboard and video card upgrades in mind. But always check your motherboard configuration and PSU for power usage and connectivity.

I've seen some stupid shit in my life, so... heh, always check your manufacturer's instructions.
 
Can we ask for tips on builds here? I didn't see anything against it in the OP.

I'm currently making the part list for my first PC, which I intend to build around September, and all my info comes from pugetsystems.com and random tech YouTubers.
As you can imagine, I'm more confused every day.
I know Intel, since I have an old i7 in my laptop and 1080p video editing on it was a breeze, so at first I thought "I'll make a really nice 11th gen PC to save some money", but then I discovered that the generational jump from 11th to 12th is really big in terms of performance.
At that point I thought to build around an i7-12700 on a B660 mobo, to take advantage of the new tech without going overboard for a Wi-Fi board.
And now YouTube is recommending me Intel vs AMD videos.
So, I was sure about going Intel for the iGPU on the processor (I'm not a gamer and I'm not going to use After Effects or anything 3D anytime soon, I just work on video and photos), but now YouTube is planting AMD ideas in the back of my brain.

So I'm here now to ask for some input from people who are not trying to sell me something with their sponsorships.
What are the real advantages of going AMD for video editing? Are the cheaper and better motherboards worth losing Intel's better video encoding/decoding?
 
If you're thinking about getting a 12th-gen Intel with E-cores (such as the i7-12700 you mentioned) and you're using Windows, you should look carefully into which versions are compatible. Windows 11 might be the only Windows that can schedule threads correctly on that kind of CPU.
AMD doesn't use this kind of architecture, so it's not a problem there.
 
So I'm here now to ask for some input from people who are not trying to sell me something with their sponsorships.
What are the real advantages of going AMD for video editing? Are the cheaper and better motherboards worth losing Intel's better video encoding/decoding?
The upcoming Ryzens will all have an iGPU/Vega (so it's been said, at least), so if you are willing to hold on for 2-3 months and see what those are like, you might be in a better position to make The Best(tm) choice. I also don't know if I've ever seen any video editors that straight up use the hardware encoder for exporting, but it's also something I've never tried to find out.

One benefit of Ryzen has been that they used the same socket, so upgrading the CPU a couple of years down the line has been a breeze. Right now AM4 is being retired, and if easy future-proofing is something you're interested in, then the upcoming AM5 will be your huckleberry. With Intel you don't have to worry about that: you can be 100% certain that you'll have to toss the motherboard and reinstall everything if you want to upgrade the CPU to a newer generation a couple of years from now.

Going AM5 also means DDR5, and that's what people are cranky about; the reasonably priced RAM right now doesn't really merit the extra cost over DDR4. Whatever you buy now will be a placeholder until the good stuff arrives at a reasonable price, and then you'll have to toss the RAM you have, because if it's mixed with faster memory everything will still run at the speed of the slowest memory in the system.

You have chosen a great time to navigate hardware and build your first computer. On the bright side, if you buy into the current Ryzen 5000 or Intel 12000 series it will still be really good for your use cases, so other than FOMO there's no reason to sweat it.
 
If you're thinking about getting a 12th-gen Intel with E-cores (such as the i7-12700 you mentioned) and you're using Windows, you should look carefully into which versions are compatible. Windows 11 might be the only Windows that can schedule threads correctly on that kind of CPU.
AMD doesn't use this kind of architecture, so it's not a problem there.
Can't believe I missed this up until your answer, thanks.
The upcoming Ryzens will all have an iGPU/Vega (so it's been said, at least), so if you are willing to hold on for 2-3 months and see what those are like, you might be in a better position to make The Best(tm) choice. I also don't know if I've ever seen any video editors that straight up use the hardware encoder for exporting, but it's also something I've never tried to find out.

One benefit of Ryzen has been that they used the same socket, so upgrading the CPU a couple of years down the line has been a breeze. Right now AM4 is being retired, and if easy future-proofing is something you're interested in, then the upcoming AM5 will be your huckleberry. With Intel you don't have to worry about that: you can be 100% certain that you'll have to toss the motherboard and reinstall everything if you want to upgrade the CPU to a newer generation a couple of years from now.

Going AM5 also means DDR5, and that's what people are cranky about; the reasonably priced RAM right now doesn't really merit the extra cost over DDR4. Whatever you buy now will be a placeholder until the good stuff arrives at a reasonable price, and then you'll have to toss the RAM you have, because if it's mixed with faster memory everything will still run at the speed of the slowest memory in the system.

You have chosen a great time to navigate hardware and build your first computer. On the bright side, if you buy into the current Ryzen 5000 or Intel 12000 series it will still be really good for your use cases, so other than FOMO there's no reason to sweat it.
Yeah, waiting a couple of months is not a problem, so I guess I'll wait and see what AMD will be offering. Shelling out the extra money for DDR5 is kind of a bummer, but I guess it will be offset by the better upgrade possibilities of the AM5 socket.
Re video editing: Intel's iGPU can work in tandem with the GPU, taking on the live processing of the files while the GPU takes on the effects, color correction and other heavy stuff.
 
Re video editing: Intel's iGPU can work in tandem with the GPU, taking on the live processing of the files while the GPU takes on the effects, color correction and other heavy stuff.
Oh, that sounds like OpenCL/CUDA work; I thought you were using the hardware (H.264/H.265) encoder to export the project into a video. Actual H.264 encoding time is the least of my problems; rendering all the crap into frames that then get encoded into the video is what takes up the time. And I will always prefer software encoding.
Intel's Quick Sync has (or had, I don't know if it's changed) a good reputation though, while AMD on the Radeon side did not fare as well. That was for streaming though, which hardware encoding seems to be targeted towards.
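For anyone curious about the software-vs-hardware encoder trade-off, with ffmpeg the paths look roughly like this (encoder availability depends on your hardware, drivers, and ffmpeg build; device path and quality values are just examples):

```shell
# Software encode: slowest, best quality per bit, runs on the CPU
ffmpeg -i input.mp4 -c:v libx264 -crf 20 -preset slow out_software.mp4

# Hardware encode via Intel Quick Sync (needs a QSV-capable iGPU)
ffmpeg -i input.mp4 -c:v h264_qsv -global_quality 22 out_qsv.mp4

# Hardware encode on AMD under Linux via VAAPI
ffmpeg -vaapi_device /dev/dri/renderD128 -i input.mp4 \
       -vf 'format=nv12,hwupload' -c:v h264_vaapi -qp 22 out_vaapi.mp4
```

The hardware encoders are much faster and lighter on the CPU, at the cost of worse quality at the same bitrate, which is why they're mostly pitched at streaming and real-time use.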
 
Socket 7 coolers
That reminds me of the nigh-indestructible old cases that could cut your hand every which way.
AGESA updates
AMD seems to break something major in every update for me, causing severe instability until I figure out a workaround. It's also not very fun that their Ryzen line has that MWAIT lockup in the errata. I was quite upset about the segfault-on-compile bug in the first gens too (I went through 3 RMAs).
while AMD on the Radeon side did not fare as well.
Have they ever? I'm almost 100% sure AMD keeps trying to stuff more retarded stuff into their user-facing software rather than fixing underlying driver issues. To be fair though, Intel's Arc might give AMD a run for its money in the "drivers broke, pls send halp" department. The ones I've seen in China make Microsoft seem bug-free.

RIP Optane though. I was interested in it for JFS and Reiser4.
 