GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Having a Mandela effect moment right now. I always thought that the K versions of Intel CPUs couldn't be properly overclocked, but apparently they can be. I'm feeling really confused.
K SKUs have unlocked multipliers, while non-K parts do not. K CPUs usually also have a better integrated memory controller (IMC) and will run faster RAM at tighter timings. Non-K CPUs can still usually be overclocked by raising BCLK above 100 MHz, but that depends on the motherboard BIOS and requires an external BCLK clock generator.
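If it helps, the arithmetic is just this (rough sketch with made-up numbers, not tied to any specific chip):

```python
# Toy illustration of the two overclocking paths: core clock = BCLK x multiplier.
# Numbers are invented for the example, not any particular CPU.

def core_clock_mhz(bclk_mhz: float, multiplier: int) -> float:
    return bclk_mhz * multiplier

print(core_clock_mhz(100.0, 40))  # stock: 100 MHz BCLK x 40 = 4000 MHz
print(core_clock_mhz(100.0, 47))  # K SKU: raise the unlocked multiplier
print(core_clock_mhz(103.0, 40))  # non-K: multiplier locked, nudge BCLK instead
```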
 
I've mentioned this in another thread, but I still use a 4690K. It's 8 years old and still works great for web browsing, word processing, etc. The only reason I'm upgrading is that I'm long overdue and it's starting to chug on the latest releases.
I'm running a 4770K and a GTX 650 and still doing fine with most day-to-day stuff. Video capturing can be iffy with audio desyncing; video editing/rendering isn't bad with NVENC, though I know it could be better.

Though like you, currently sourcing parts for a complete upgrade, because it's been a long time coming.
 
If I plan on building a PC within the next 6 months, should I try to wait and scoop up a 4080 on release or settle for a 3080? If I were to buy a GPU, it would be around October/November.
 
If I plan on building a PC within the next 6 months, should I try to wait and scoop up a 4080 on release or settle for a 3080? If I were to buy a GPU, it would be around October/November.
Unless you have a 4K monitor or do production work that needs the VRAM, I would advise you to purchase a new 3080 with a good warranty at the best price you can find; when the 4080 comes out, the surge in demand will also affect the 3080.

While inflation and the cryptocurrency crash have made current GPU pricing reasonable, a lot of people are always vying for the latest in GPU technology, and it's only a matter of time before the GPU companies announce their next lineup to drum up excitement.
 
How come we don't have an announcement for the 4000 series yet? It's only a few weeks from the two-year mark, and we should be getting leaks about them around now if they're starting production.
 
If I plan on building a PC within the next 6 months, should I try to wait and scoop up a 4080 on release or settle for a 3080? If I were to buy a GPU, it would be around October/November.
Hard to say as there's a lot of "if"s to worry about.

If tech YouTubers are to be believed, nVidia has a big overstock problem, scalpers are hoping to scalp the 4000 series on launch, and the GPU crash is going to hit in a big way around then.

Why this matters is that, if the tech YouTubers and their fans are right, nVidia can't charge much more than the next tier of 3000 cards: why buy a 4070 at $600 when a 3080 is $400 or less for the same performance?

But then I take tech YouTuber opinions with a pinch of salt. Supposedly graphics cards have been below MSRP since March or so, but I don't see it in the UK.

How come we don't have an announcement for the 4000 series yet? It's only a few weeks from the two-year mark, and we should be getting leaks about them around now if they're starting production.
Supposedly nVidia has a problem with overstock. They have way more 3000 series cards than they know what to do with and more on the way, and are even considering delaying the 4000 series, but if they do, AMD will eat their lunch.

Meanwhile the supposed crypto crash has cratered used prices (I'm not seeing it here yet) so that's creating more pressure on them.

If that's true (and this is based on the tech YouTube rumour mill) then nVidia is likely holding off on announcing anything until they absolutely have to.
 
Hard to say as there's a lot of "if"s to worry about.

If tech YouTubers are to be believed, nVidia has a big overstock problem, scalpers are hoping to scalp the 4000 series on launch, and the GPU crash is going to hit in a big way around then.

Why this matters is that, if the tech YouTubers and their fans are right, nVidia can't charge much more than the next tier of 3000 cards: why buy a 4070 at $600 when a 3080 is $400 or less for the same performance?

But then I take tech YouTuber opinions with a pinch of salt. Supposedly graphics cards have been below MSRP since March or so, but I don't see it in the UK.


Supposedly nVidia has a problem with overstock. They have way more 3000 series cards than they know what to do with and more on the way, and are even considering delaying the 4000 series, but if they do, AMD will eat their lunch.

Meanwhile the supposed crypto crash has cratered used prices (I'm not seeing it here yet) so that's creating more pressure on them.

If that's true (and this is based on the tech YouTube rumour mill) then nVidia is likely holding off on announcing anything until they absolutely have to.
That makes sense, but I highly doubt AMD could really beat them; the 3070 was half the price of the 6900 XT and relatively the same performance-wise. They still haven't come out with anything that beats the 3080, and even if they did, they'd have to use some tricks to beat the 3090 Ti in performance metrics. Even given an extra year, I doubt AMD would overtake them.

Honestly, if it weren't for the massive shortage, the 3000 series would have been dominant; the price for the power truly was an absurd deal.
 
That makes sense, but I highly doubt AMD could really beat them; the 3070 was half the price of the 6900 XT and relatively the same performance-wise. They still haven't come out with anything that beats the 3080, and even if they did, they'd have to use some tricks to beat the 3090 Ti in performance metrics. Even given an extra year, I doubt AMD would overtake them.

Honestly, if it weren't for the massive shortage, the 3000 series would have been dominant; the price for the power truly was an absurd deal.
LMAO, 3070 same performance as a 6900xt?

No.
 
that makes sense, but i highly doubt AMD could really beat them, the 3070 was half the price of the 6900 xt and was relatively the same performance wise.
Um, what? I'm re-reading your post to see if I'm missing some specific context such as performance-per-watt or mining hash rate (which I know little about), but I don't see anything like that. If you're talking graphics performance then certainly not. Perhaps if you're gaming at 1080p and both cards have maxed out the human-perceptible difference in terms of image quality. In that case, I could see what you're getting at. But if you're not, then... I'm puzzled what you're talking about. The 6900 XT is a lot more capable than a 3070.
 
Um, what? I'm re-reading your post to see if I'm missing some specific context such as performance-per-watt or mining hash rate (which I know little about), but I don't see anything like that. If you're talking graphics performance then certainly not. Perhaps if you're gaming at 1080p and both cards have maxed out the human-perceptible difference in terms of image quality. In that case, I could see what you're getting at. But if you're not, then... I'm puzzled what you're talking about. The 6900 XT is a lot more capable than a 3070.
With no qualifiers, you have to assume raw graphical power, since that's what the vast majority look for in a GPU. With that in mind, the comparison is nonsense: even my 6800 XT shits all over a 3070.

I've seen team green doing a lot of cherry picking this generation.
 
Um, what? I'm re-reading your post to see if I'm missing some specific context such as performance-per-watt or mining hash rate (which I know little about), but I don't see anything like that. If you're talking graphics performance then certainly not. Perhaps if you're gaming at 1080p and both cards have maxed out the human-perceptible difference in terms of image quality. In that case, I could see what you're getting at. But if you're not, then... I'm puzzled what you're talking about. The 6900 XT is a lot more capable than a 3070.
Maybe in raytracing, but who cares about that right now? I think we're in a situation where raytracing will only be widely used with the PS6 and Xbox-whatever (and this will affect PC gaming as well). Until then, rasterization will still be king.

Waiting for the 4000 series or buying a 3000 card right now? The worst-case scenario is that a lot of other people are holding off on buying anything right now, and if scalpers snatch up all the 4000s, then those that have been holding off will have to set their sights on the 3000s. If I were in the market and the price was agreeable, I would buy a 3000 series card right now, because who the fuck knows what will happen.
 
Maybe in raytracing, but who cares about that right now? I think we're in a situation where raytracing will only be widely used with the PS6 and Xbox-whatever (and this will affect PC gaming as well). Until then, rasterization will still be king.

Waiting for the 4000 series or buying a 3000 card right now? The worst-case scenario is that a lot of other people are holding off on buying anything right now, and if scalpers snatch up all the 4000s, then those that have been holding off will have to set their sights on the 3000s. If I were in the market and the price was agreeable, I would buy a 3000 series card right now, because who the fuck knows what will happen.
Same boat. I have an old Radeon RX 480. I don't need a new card but I like to stay mostly current. It's coming up on six years old now. If I got myself a 6600 XT or a 6700 XT it would probably be a pretty big boost. And prices have now come down closer to MSRP. Then again, there are about 4 months to go before the next gen is out.

Anyone think the next gen will have AV1 encoding? Might be a nice feature to have.
 
God, I can only hope V-cache 7000 series AMD CPUs can release before the end of the fucking world as we know it. Let me enjoy the last days of civilization instead of weeping all day.

They're flying off the shelves in the server market. Azure HBv3 bought up all the 64c 7003X EPYCs that AMD could make for the first run. For the applications I work with, there's a 1.3x-2x performance boost from the better cache, because with the "vanilla" 7003s, you stop getting much gain past 24 cores. I'm a little skeptical that this is going to matter much for gaming or video editing. Are cache stalls really that big an issue with the current Ryzens?
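Rough intuition for why the extra cache moves the needle, as a toy model with completely invented numbers (not our real scaling data):

```python
# Crude toy model: cores scale perfectly until they saturate memory bandwidth,
# then extra cores mostly wait. A bigger L3 keeps more traffic on-die, which
# effectively pushes that saturation point out. All numbers are made up.

def speedup(cores: int, cores_until_bandwidth_bound: int) -> int:
    return min(cores, cores_until_bandwidth_bound)

for cores in (8, 16, 24, 32, 64):
    vanilla = speedup(cores, 24)   # "vanilla" 7003: flattens past ~24 cores
    vcache = speedup(cores, 48)    # 7003X: bigger cache, later saturation
    print(f"{cores:2d} cores -> vanilla {vanilla}x, V-Cache {vcache}x")
```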
 
Same boat. I have an old Radeon RX 480. I don't need a new card but I like to stay mostly current. It's coming up on six years old now. If I got myself a 6600 XT or a 6700 XT it would probably be a pretty big boost. And prices have now come down closer to MSRP. Then again, there are about 4 months to go before the next gen is out.

Anyone think the next gen will have AV1 encoding? Might be a nice feature to have.
Nice, I'm also on a 480 and it works just fine in the few new games I play. My "woe is me and my underpowered card" is that in recent years I have to fiddle with a couple of settings to get 1080p@60 on high (not ultra). I would upgrade if a new GPU came with drivers that somehow forcibly changed screen-door transparency to alpha blending. It's the return of an effect seen in 3D games on the Amiga.
It's driving me insane with rage.
Screen-Door-Transparency.gif
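For anyone who hasn't noticed it yet, the difference boils down to something like this (quick numpy sketch, not how any actual engine implements it):

```python
# Alpha blending mixes the surface with what's behind it; screen-door
# "transparency" just throws away pixels in a fixed dither pattern.
# Illustrative only, not real engine code.
import numpy as np

def alpha_blend(src, dst, alpha):
    # Classic "over" operator: out = src*a + dst*(1 - a)
    return src * alpha + dst * (1.0 - alpha)

def screen_door(src, dst, alpha):
    # Keep src only where a 2x2 ordered-dither threshold passes; no mixing at all.
    bayer = np.array([[0.25, 0.75], [1.00, 0.50]])
    h, w = src.shape[:2]
    mask = alpha >= np.tile(bayer, (h // 2, w // 2))[..., None]
    return np.where(mask, src, dst)

dst = np.zeros((4, 4, 3))   # black background
src = np.ones((4, 4, 3))    # white surface at 50% opacity
print(alpha_blend(src, dst, 0.5)[0, 0])      # smooth 50% grey
print(screen_door(src, dst, 0.5)[:, :, 0])   # checkerboard of 0s and 1s
```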

Didn't the most recent GPUs support AV1? (AV1 also spawned avif which is even more contemptible than webp)
 
Didn't the most recent GPUs support AV1? (AV1 also spawned avif which is even more contemptible than webp)
I said AV1 encoding. Not decoding.

As to the format itself, I'd be very happy with H.265/HEVC (whatever it's called). I despise webp and I have no great love for AV1. However, Google and Netflix are both pushing AV1 hard so I expect it to become a common standard.
 
Nice, I'm also on a 480 and it works just fine in the few new games I play. My "woe is me and my underpowered card" is that in recent years I have to fiddle with a couple of settings to get 1080p@60 on high (not ultra). I would upgrade if a new GPU came with drivers that somehow forcibly changed screen-door transparency to alpha blending. It's the return of an effect seen in 3D games on the Amiga.
It's driving me insane with rage.
Screen-Door-Transparency.gif

Didn't the most recent GPUs support AV1? (AV1 also spawned avif which is even more contemptible than webp)

Most games are using deferred rendering these days, which can't do transparency without an extra rendering pass (or passes). If that's how the game engine works, there's no way to force alpha blending; you'd have to fundamentally change the rendering engine.
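A toy model of the problem, nothing like real engine code, just the shape of it:

```python
# In a deferred pipeline the G-buffer stores exactly one surface per pixel,
# so by the time lighting runs, anything behind a "transparent" surface is gone.
# Hypothetical minimal model for illustration only.

gbuffer = {}  # pixel -> the single surviving surface (albedo, depth)

def write_gbuffer(pixel, albedo, depth):
    # Plain depth test: keep only the nearest surface. There is no slot for layers.
    if pixel not in gbuffer or depth < gbuffer[pixel]["depth"]:
        gbuffer[pixel] = {"albedo": albedo, "depth": depth}

write_gbuffer((0, 0), "red wall", depth=10.0)
write_gbuffer((0, 0), "glass pane", depth=5.0)  # overwrites the wall outright

print(gbuffer[(0, 0)])
# Only the glass survives, so there's nothing left to blend against --
# hence a separate forward pass for transparents, or dither tricks like screen-door.
```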
 
I would upgrade if a new GPU came with drivers that somehow forcibly changed screen-door transparency to alpha blending. It's the return of an effect seen in 3D games on the Amiga.
I thought it was just me that noticed that, and assumed it was a quirk of my strange setup.

It's a shame as I like the effect on Saturn and other similar consoles. I'm guessing you're supposed to use motion blur and anti-aliasing to hide it?

Anyone think the next gen will have AV1 encoding? Might be a nice feature to have.
I hope so. I know it was listed somewhere in one of AMD's roadmaps, but I forget where I saw it or how far out it was planned. AV1 encoding is the main reason I'm one of the only people still interested in Arc.

As to the format itself, I'd be very happy with H.265/HEVC (whatever it's called). I despise webp and I have no great love for AV1. However, Google and Netflix are both pushing AV1 hard so I expect it to become a common standard.
I'm interested in it because it's open and royalty-free, so adoption might be wider than H.264/265. I'm hoping that if AV1 lives up to the hype, it becomes the new standard for a long time, like H.264 was.
 
Most games are using deferred rendering these days, which can't do transparency without an extra rendering pass (or passes). If that's how the game engine works, there's no way to force alpha blending; you'd have to fundamentally change the rendering engine.
Sorry, it was a joke about how I would pay money to get rid of it through brute force, like a novelty SSAA solution for the modern era. I know it would look strange.
 
Has anyone tried AMD Link game streaming recently with Windows or Android clients? How does it compare to Nvidia GameStream? I'm hoping to stream 1080p over 5 GHz Wi-Fi on my LAN.

Weird context:
A year ago, I set up a Windows VM on a headless server that's connected to a GTX 1660 (GPU passthrough). Currently, I can stream games from that server using the Moonlight client, either to my Windows netbook or my Android tablet. Now I'm hoping to upgrade the server to a newer AMD GPU, but I can't seem to find good reviews of the AMD Link experience.

Also, the AMD Link Windows client was a recent addition, right? I recall that when I set this up, there were only mobile clients.

I followed this guide for the headless gaming server setup, btw, in case you're curious:
 