GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Not well versed in GPUs so I have to ask. My everyday PC is a Gateway DX4831-01e w/Win 7 I bought several years ago from someone leaving the US. It has an i3-530 and a 350W PSU from an old Dell. It doesn't have a UEFI BIOS. I put an AMD Radeon HD 6570 in it that is now on its last legs. Until last week it was running fine.

What GPU can I get to replace it? I was thinking a GT 710, but I can't find any info on whether it requires a UEFI BIOS to be recognized. I'm kind of lost as it's an old PC, but it's fine for everyday use and watching video. My other PC is getting the Win 10 infinite boot loop, and until I can find a permanent solution to that I'm stuck with the Gateway.
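For what it's worth, the 350W Dell supply is probably the tightest constraint on the replacement. Below is a rough back-of-the-envelope headroom check using assumed figures (the i3-530's 73W rated TDP, a generous guess for the rest of the system, and the 75W limit a card can draw from the slot without a 6-pin connector), so treat the numbers as illustrative only:

```python
# Rough PSU headroom estimate - all wattages below are assumed/typical
# figures, not measurements from this particular machine.
psu_watts = 350        # old Dell supply mentioned above
cpu_tdp = 73           # Intel Core i3-530 rated TDP
rest_of_system = 60    # board, RAM, drives, fans: a generous guess
pcie_slot_limit = 75   # max a card may draw with no 6-pin connector

headroom = psu_watts * 0.8 - (cpu_tdp + rest_of_system)  # keep ~20% margin on an aged PSU
print(f"GPU power budget: about {headroom:.0f} W")        # ~147 W here
print(f"Slot-only cards need at most {pcie_slot_limit} W, so any of them fit")
```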
 
Not well versed in GPUs so I have to ask. My everyday PC is a Gateway DX4831-01e w/Win 7 I bought several years ago from someone leaving the US. It has an i3-530 and a 350W PSU from an old Dell. It doesn't have a UEFI BIOS. I put an AMD Radeon HD 6570 in it that is now on its last legs. Until last week it was running fine.

What GPU can I get to replace it? I was thinking a GT 710, but I can't find any info on whether it requires a UEFI BIOS to be recognized. I'm kind of lost as it's an old PC, but it's fine for everyday use and watching video. My other PC is getting the Win 10 infinite boot loop, and until I can find a permanent solution to that I'm stuck with the Gateway.
I would bet not, since the GT 710 is one of those chips that Nvidia revived fairly recently to sell as low-power video options for business PCs. So there's a good chance that card is too new to support legacy BIOS. I'd try to find something more period-correct for that machine, like a GeForce 200 or 400 series card, or another Radeon HD 6000 series.
 
I would bet not, since the GT 710 is one of those chips that Nvidia revived fairly recently to sell as low-power video options for business PCs. So there's a good chance that card is too new to support legacy BIOS. I'd try to find something more period-correct for that machine, like a GeForce 200 or 400 series card, or another Radeon HD 6000 series.
I found the same card on eBay, hopefully it's in better shape than mine.
 
I would bet not, since the GT 710 is one of those chips that Nvidia revived fairly recently to sell as low-power video options for business PCs. So there's a good chance that card is too new to support legacy BIOS. I'd try to find something more period-correct for that machine, like a GeForce 200 or 400 series card, or another Radeon HD 6000 series.
I would say Nvidia has an overall better track record when it comes to legacy BIOS support; I have seen 980s and 1060s used in some crusty C2Q builds.
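Whether a non-UEFI board like that Gateway can even POST with a given card mostly comes down to whether the card's video BIOS still carries a legacy (PC-AT) image alongside the UEFI GOP image. Here's a minimal sketch of how you could check a VBIOS dump (for example one saved with GPU-Z) for both image types, based on the standard PCI expansion ROM layout; the file path is just a placeholder:

```python
# List the images inside a PCI expansion ROM (VBIOS) dump.
# A legacy-only motherboard needs at least one PC-AT (code type 0x00) image;
# a GOP-only ROM will show nothing but EFI (code type 0x03) images.
import struct
import sys

CODE_TYPES = {0x00: "PC-AT (legacy BIOS)", 0x01: "Open Firmware",
              0x02: "PA-RISC", 0x03: "EFI (UEFI GOP)"}

def list_images(rom: bytes):
    offset = 0
    while offset + 0x1C <= len(rom):
        # Every image starts with the 0x55AA ROM signature.
        if rom[offset:offset + 2] != b"\x55\xAA":
            break
        # Offset 0x18 holds a 16-bit pointer to the PCI Data Structure ("PCIR").
        pcir_ptr, = struct.unpack_from("<H", rom, offset + 0x18)
        pcir = offset + pcir_ptr
        if rom[pcir:pcir + 4] != b"PCIR":
            break
        image_len, = struct.unpack_from("<H", rom, pcir + 0x10)  # in 512-byte units
        code_type = rom[pcir + 0x14]
        indicator = rom[pcir + 0x15]   # bit 7 set marks the last image
        yield CODE_TYPES.get(code_type, f"unknown ({code_type:#04x})")
        if indicator & 0x80 or image_len == 0:
            break
        offset += image_len * 512

if __name__ == "__main__":
    with open(sys.argv[1], "rb") as f:   # e.g. python check_vbios.py gt710.rom
        rom_data = f.read()
    for image in list_images(rom_data):
        print(image)
```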
 
Not well versed in GPUs so I have to ask. My everyday PC is a Gateway DX4831-01e w/Win 7 I bought several years ago from someone leaving the US. It has an i3-530 and a 350W PSU from an old Dell. It doesn't have a UEFI BIOS. I put an AMD Radeon HD 6570 in it that is now on its last legs. Until last week it was running fine.

What GPU can I get to replace it? I was thinking a GT 710, but I can't find any info on whether it requires a UEFI BIOS to be recognized. I'm kind of lost as it's an old PC, but it's fine for everyday use and watching video. My other PC is getting the Win 10 infinite boot loop, and until I can find a permanent solution to that I'm stuck with the Gateway.
Here's an article about fixing the boot loop.
 
A few days ago I found an old Core 2 Quad in one of my drawers. I think it may have come out of an old Dell computer I had years ago. What should I do with it?
 
So reviews are coming in for the Radeon 6900XT. Beats out the 3090 in at least half the titles tested for approx. $500 less.

Unless you turn on Ray-tracing. Then it plummets.

Way I'm reading the current state of play is: if you don't care about ray-tracing and you don't need CUDA, get AMD. Otherwise Nvidia.

Nvidia do have DLSS, which AMD don't have a solution for yet, but one will come and be supported on current cards. I doubt it will be as good as Nvidia's, but I think it will be "good enough".

I'm somewhat tempted to get a 6700 when they come out, but it depends on the price. GPUs have become insanely expensive these days.
 
So reviews are coming in for the Radeon 6900XT. Beats out the 3090 in at least half the titles tested for approx. $500 less.

Unless you turn on Ray-tracing. Then it plummets.

Way I'm reading the current state of play is: if you don't care about ray-tracing and you don't need CUDA, get AMD. Otherwise Nvidia.

Nvidia do have DLSS, which AMD don't have a solution for yet, but one will come and be supported on current cards. I doubt it will be as good as Nvidia's, but I think it will be "good enough".

I'm somewhat tempted to get a 6700 when they come out, but it depends on the price. GPUs have become insanely expensive these days.

Good luck with either of these. The Big Navi cards are even more vaporware than Ampere.

Apparently FidelityFX, which is their response to DLSS, is already implemented in hardware; they just need to get it done properly in software. Personally, if I were AMD I would have prioritised that over ray tracing, because ray tracing is still a meme right now, but most normies can understand what ray tracing is from side-by-side screenshots while DLSS is harder to sell.

But, yeah, RTX 3090 is looking increasingly stupid.
 
But, yeah, RTX 3090 is looking increasingly stupid.

Yeah. I said it before but I'll say it again - Nvidia have been tripped up by their own marketing. The 3090 is an excellent prosumer card for rendering and ML and I think that's what it was designed for - hence the VRAM that is far more than can actually be useful for gaming. But at some point they felt they had to market it as this ultimate gaming card which it ain't.

Keep the price and say it's for graphics artists and home animators and you instantly have a great product. But they ain't doing that.
 
Yeah. I said it before but I'll say it again - Nvidia have been tripped up by their own marketing. The 3090 is an excellent prosumer card for rendering and ML and I think that's what it was designed for - hence the VRAM that is far more than can actually be useful for gaming. But at some point they felt they had to market it as this ultimate gaming card which it ain't.

Keep the price and say it's for graphics artists and home animators and you instantly have a great product. But they ain't doing that.
Just Nvidia cashing in on reputation.
 
Yeah. I said it before but I'll say it again - Nvidia have been tripped up by their own marketing. The 3090 is an excellent prosumer card for rendering and ML and I think that's what it was designed for - hence the VRAM that is far more than can actually be useful for gaming. But at some point they felt they had to market it as this ultimate gaming card which it ain't.

Keep the price and say it's for graphics artists and home animators and you instantly have a great product. But they ain't doing that.
They can’t say that about the 3090 because of the insane power draw. The point behind prosumer cards is good performance with better efficiency, and the 3090 fails spectacularly at efficiency.
 
I was saving money for a new PC but the market is so fucked right now I don't even know what the smart thing to buy is, or if it's smart to buy at all. If I didn't need a good PC as my workstation I'd call it quits and just buy a PS5, because it's getting retarded; it seems the best strategy is to wait a few months and see how it's shaping up.

AMD is supposed to release something to compete with the 3080, I take it? Settling for anything feels risky as it will probably become obsolete in a couple of years with the shit that new games and top-of-the-line cards are adding; they all seem like a shit investment...
 
I was saving money for a new PC but the market is so fucked right now I don't even know what the smart thing to buy is, or if it's smart to buy at all. If I didn't need a good PC as my workstation I'd call it quits and just buy a PS5, because it's getting retarded; it seems the best strategy is to wait a few months and see how it's shaping up.

AMD is supposed to release something to compete with the 3080, I take it? Settling for anything feels risky as it will probably become obsolete in a couple of years with the shit that new games and top-of-the-line cards are adding; they all seem like a shit investment...
Depends on what you mean by obsolete and investment; graphics cards are perishable goods in a way, they age like parmigiano reggiano kept in a plastic bag in the back of your fridge. With few exceptions, fridge cheese won't gain monetary value.

Seriously though, wait until summer. There's nothing available now and we're soon going into Chinese New Year, so there will be that bubble in manufacturing. Maybe July or August would be a good time to hopefully be able to buy something (anything) at MSRP. I'm pessimistic.
 
Depends on what you mean by obsolete and investment; graphics cards are perishable goods in a way, they age like parmigiano reggiano kept in a plastic bag in the back of your fridge. With few exceptions, fridge cheese won't gain monetary value.

Seriously though, wait until summer. There's nothing available now and we're soon going into Chinese New Year, so there will be that bubble in manufacturing. Maybe July or August would be a good time to hopefully be able to buy something (anything) at MSRP. I'm pessimistic.
Well, by investment I mean buying something that will give me consistent performance over the longest time and isn't too stupidly expensive. For work I don't need an overkill GPU, although I'd like to get into 3D and video editing more. But it would also be good to know that I'm up to speed with gayming, since I'm not buying a console on the side. I bought my current PC like 6 years ago; I usually last a long time with my computers, so it would be frustrating if in 2 years I'm already cucked by the new mystical tech all developers decided was the standard for every new-gen game. I also have a 4K TV and it would be nice to game at good fps at that resolution.

Summer is probably the best time to check back, as you say; it's likely I'll be waiting till next Christmas even. I have a Radeon R7 200 right now; I wonder if I could buy the motherboard, processor, and RAM come January and keep using that same card until I can replace it with a modern GPU.
 
Well, by investment I mean buying something that will give me consistent performance over the longest time and isn't too stupidly expensive. For work I don't need an overkill GPU, although I'd like to get into 3D and video editing more.

Summer is probably the best time to check back, as you say; it's likely I'll be waiting till next Christmas even. I have a Radeon R7 200 right now; I wonder if I could buy the motherboard, processor, and RAM come January and keep using that same card until I can replace it with a modern GPU.
You can get good performance with surprisingly old cards. If you're getting into content creation like 3D and video you can get by with what you have; you will likely not push any real limits of your hardware, and that's not a diss.

Hot tip: If you put X dollars down and feel that you get X dollars' worth, and it's X dollars you can comfortably spend, then you've made a good deal. Look at how long your current card has held up.
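One way to put numbers on that hot tip is the cost-per-frame figure reviewers like to use: divide the price you'd actually pay by the average FPS the card gets in the games you play. The prices and frame rates below are made-up placeholders, just to show the arithmetic:

```python
# Cost-per-frame comparison. Prices and FPS figures are hypothetical
# placeholders - plug in current street prices and benchmarks for your games.
candidates = {
    "Card A": {"price": 400, "avg_fps": 90},
    "Card B": {"price": 650, "avg_fps": 120},
}

for name, c in candidates.items():
    print(f"{name}: {c['price'] / c['avg_fps']:.2f} $/frame")

# Lower $/frame means more performance for the money; weigh it against how
# many years you expect the card to stay comfortable, as the post above says.
```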
 
I have an RX 580 in my 2nd machine and it's still more than good enough for 1080p high detail in most games. If you're on a budget and don't mind used, it's still the best choice for most gamers.

I was regretting the 1660 Super I got in late 2019 for $220 seeing the new card hype, but it's looking better and better as time goes on. Especially now, since it appears the 3050 Ti will be close to 2060 performance but with ray tracing and DLSS for $250 MSRP. Meh.
People talk shit about the 1660 Super/Ti but they're both pretty good cards for 1080p Ultra/1440p High settings. A good ~$250 (MSRP) GPU for driving all but the least optimized modern titles.
I didn't buy into the RTX ON hype when the 20 series cards came out (I was upgrading from a 970 then) and I still don't much care for it. So some reflections look better, big whoop. Pretty graphics won't make a shit game good.

If you have a monitor from the last 5 or so years that wasn't some bleeding-edge $400+ model, then you won't really need more than a 2070/5700XT for 60Hz+ refresh rates anyway. Anything more is a waste or outright dick-waving unless the goal IS ray tracing or you have a top-tier monitor.

I'd imagine that the average home gamer doesn't have a high refresh rate display. The most common dedicated monitors (i.e. cheap old office monitors) are still 1080p 60Hz while TVs tend to hover around 4K 60Hz. Performance overhead is nice to account for worse software optimization down the line, but if what you're after is AAA gaming at 1080p to 1440p, don't bother with something like the 3080 or 6800XT. That's too much money spent on shiny framerates that your monitor can't even display and resolutions you aren't using.

I'll never understand the consoomerism shit in PC gaming beyond the spec nerds and overclockers. The new shiny isn't always the best.
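To put a rough number on "framerates your monitor can't even display": the most a panel can actually show is its resolution times its refresh rate, so a quick comparison of standard display modes (nothing assumed beyond the usual resolutions) makes the mismatch obvious:

```python
# Pixels per second a display can actually present (width x height x refresh).
displays = {
    "1080p @ 60 Hz":  (1920, 1080, 60),
    "1440p @ 144 Hz": (2560, 1440, 144),
    "4K @ 60 Hz":     (3840, 2160, 60),
}

for name, (w, h, hz) in displays.items():
    print(f"{name}: {w * h * hz / 1e6:.0f} Mpx/s")

# Roughly 124, 531 and 498 Mpx/s respectively - a GPU that can push far past
# what the panel refreshes is wasted on that monitor.
```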
 
People talk shit about the 1660 Super/Ti but they're both pretty good cards for 1080p Ultra/1440p High settings. A good ~$250 (MSRP) GPU for driving all but the least optimized modern titles.
I didn't buy into the RTX ON hype when the 20 series cards came out (I was upgrading from a 970 then) and I still don't much care for it. So some reflections look better, big whoop. Pretty graphics won't make a shit game good.

If you have a monitor from the last 5 or so years that wasn't some bleeding-edge $400+ model, then you won't really need more than a 2070/5700XT for 60Hz+ refresh rates anyway. Anything more is a waste or outright dick-waving unless the goal IS ray tracing or you have a top-tier monitor.

I'd imagine that the average home gamer doesn't have a high refresh rate display. The most common dedicated monitors (i.e. cheap old office monitors) are still 1080p 60Hz while TVs tend to hover around 4K 60Hz. Performance overhead is nice to account for worse software optimization down the line, but if what you're after is AAA gaming at 1080p to 1440p, don't bother with something like the 3080 or 6800XT. That's too much money spent on shiny framerates that your monitor can't even display and resolutions you aren't using.

I'll never understand the consoomerism shit in PC gaming beyond the spec nerds and overclockers. The new shiny isn't always the best.

Yep! Those cards are still fine for a lot of enjoyment. The 480/580 launched just under half a decade ago, and I'd say we're just reaching the tipping point where they start to become an issue with the latest games. Honestly, the difference in enjoyment between playing most games on Highest and on Medium is pretty small so long as you haven't let your monitor outpace your GPU's capability. If you've bought some whopping 4K monitor and are trying to play at native resolution then you need a newer card than that. But otherwise you can have plenty of fun with the 580.

I would like to get one of the newer graphics cards, but frankly, with them in the £500 region the hobby is simply too expensive for me now. I might consider the 6700 when it is released, which I think will be around the £300 mark, but even that is really more than I think I should spend on what is effectively a toy.

My next upgrade cycle is likely to be in a couple of years' time. We'll have RDNA3, we'll have Zen architectures on 5nm and (iirc) DDR5. Now isn't a bad time to upgrade - this new generation of GPUs really is a significant leap ahead of where we were a few months ago, and PCIe 4.0 is now standard on new AMD systems, which is great for SSD performance with a suitable drive. But I updated my system a few years ago and it's too soon.
 
You can get good performance with surprisingly old cards. If you're getting into content creation like 3D and video you can get by with what you have; you will likely not push any real limits of your hardware, and that's not a diss.

Hot tip: If you put X dollars down and feel that you get X dollars' worth, and it's X dollars you can comfortably spend, then you've made a good deal. Look at how long your current card has held up.

Got a GTX 1080 Ti in 2017. Still using it. Though the 1080 Ti is kinda too good: it cost about 30 percent more than the 1080 but was around 50 percent stronger at launch. I heard the reason it had 11 GB of VRAM was that the Titan XP had 12, and they worried it would trash the more expensive Titan if not gimped in that way.

For the record, I can max CP77 with it with ray tracing off, but only at 1440p, not at 4K. At 4K it drops to around 20fps maxed and becomes rather cinematic. At 1440p it's between 40 and 60 fps, which is very playable.

Unfortunately, ray tracing is a meme now, but it won't be a meme in the next few years given that both new consoles have it.

I might consider the 6700 when it is released

Good luck with that. Big Navi is even more of a paper part than Ampere.

For the record, my queue position for the 3080 I ordered back in September is now 141. It's been 3 months and I'm halfway through the queue. And Scan said they were only going to take as many orders as they could be confident of filling within 6 weeks.
 