More nvidia shenanigans.
Idk, it's like Nvidia just can't help looking scummy.
> Assembled my Ryzen 9 7900X workhorse build, but didn't have money left over for a GPU. Thought I could run a 4K monitor off it for desktop use with the meager integrated graphics for a bit, but whenever there are multiple videos on-screen, it seems to overwhelm the poor thing and the drivers crash. It looks like I'll need a discrete card sooner rather than later.

That's too bad. From what I've heard, Intel has always had the superior video decode/encode capabilities. It's hard to find out the exact capabilities of VCN, but if playing back multiple 2K/4K videos across 2+ 4K monitors doesn't just work, that's lame. Maybe driver updates will help.
Switch from hardware to software decoding; you'll burn more electricity, but your problem is solved.
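If it's browser playback, turning off hardware acceleration in the browser's settings does the same thing. For a standalone player, here's a minimal sketch of the idea, assuming mpv is installed and on your PATH; the filename is just a placeholder:

```python
# Minimal sketch: launch mpv with hardware decoding disabled, so the CPU does
# the decode work instead of the iGPU's VCN block. More power draw, but it
# sidesteps flaky hardware decode paths. "video.mkv" is a placeholder.
import subprocess

subprocess.run(["mpv", "--hwdec=no", "video.mkv"], check=True)
```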
So after half a decade my rig is starting to show its age, and so am I; I feel like an old man looking at all this new stuff.
Are AMD cards worth getting? I know NVIDIA holds a large market share and everyone seems to have them, but goddamn, the prices are high; at least AMD has some nice sales going on.
I'm seeing DDR5 RAM starting to pop up; is it worth getting over DDR4 for a gaming rig?
Yes, AMD cards are worth getting and are very competitive in the price band they've chosen to stick to. If what you want is to win bragging rights for achieving the highest fps in 4K until next year's video cards come out, get NVIDIA. If you aren't interested in spending over $1000 on a card, get whatever's on sale. In a given price band, they're pretty much all equivalent (even the Intel Arc is...about what you'd expect for the price). Game developers target consoles and midrange PCs, plus we're very close to an absolute power ceiling on what users will tolerate, so you'll be fine for the next 5-7 years.
Regarding DDR5, video game software is nowhere close to saturating the bandwidth of DDR4. Consequently, adding more bandwidth doesn't add much real game performance. Gaming benchmarks show, at best, around a 2%-3% performance improvement from going from the slowest DDR4-2133 to the fastest DDR5-6400. You will be absolutely fine with DDR4.
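For a rough sense of scale, here's the back-of-the-envelope peak-bandwidth arithmetic for dual-channel configs (64 bits per channel; the speed grades below are just examples):

```python
# Theoretical peak bandwidth = transfers/s * 8 bytes per 64-bit channel * channels.
def peak_bandwidth_gbs(mt_per_s: int, channels: int = 2, bytes_per_transfer: int = 8) -> float:
    return mt_per_s * 1e6 * bytes_per_transfer * channels / 1e9

for name, mts in [("DDR4-2133", 2133), ("DDR4-3200", 3200), ("DDR5-6400", 6400)]:
    print(f"{name}: ~{peak_bandwidth_gbs(mts):.0f} GB/s peak, dual channel")
```

That's roughly 34 GB/s vs. 102 GB/s theoretical peak, so the ceiling triples on paper; since games weren't bandwidth-bound to begin with, most of that headroom goes unused, hence the 2%-3% benchmark deltas.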
What is your current machine?
Radeon 6600s are $200 or so, if you just need something to stuff in the workstation and not worry about.
I'm toying around with the idea of taking the GTX 1080 out of my 2017ish gaming PC (i7-7700K, 16GB DDR4) to put in the workhorse and buying something else for the gaming rig. It's hooked up to a 1080p TV right now, which I might replace with a newer 4K one within a year or so, at which point I'd need something beefy to run the graphics but also small because of the cramped cube case, which sounds like it might be a no-go.
So my dilemma, I guess, is whether there's any sense in trying to prolong the i7's suffering with something like a shorty RTX 3060 after I harvest the 1080 and be stuck with an aging sub-par rig, or if I should just suck it up, buy something new for the workhorse, and junk the i7 when I'm ready to completely replace it and eat cardboard and drywall for a few months.
my current machine has:
Intel i5-6600K
16GB RAM
GeForce GTX 1650
Aside from a GPU upgrade I did two years ago, everything is 7 years old. It runs most of the games I play just fine, but it's been getting harder and harder to play games even on the lowest settings. I have a nice 1080p monitor that's just as old, and I have no plans to upgrade it anytime soon.
Thoughts/flamewars on the i7-12700K vs. the i9-9900K? What major factors are there besides price?
> Is there anywhere to buy a 3080 Ti for less than 4 digits?

lmao. tho you could always try your luck on eBay.
> What's the difference between an RTX 3070 8GB V2 and an RTX 3070 GAMING Z TRIO 8GB LHR?

If you're just planning to play games, there's no real practical difference. *Generally* the slightly more expensive cards in the same tier will have marginally better cooling or overclocking headroom, but unless you're worried about benchmark dick-waving it isn't something you're going to notice.
> Some possible leaks about upcoming Ryzen models with 3D cache. I think that's the guy who is often reliable with these leaks.

This, along with that article from TechPowerUp the other day, gives me hope we'll see a 16-core X3D version. I would take it with more than a few grains of salt, but I'll remain optimistic.
If it's a bigger leap than the 5800X3D was, then there will be a lot of glorious screeching about it.
Sudden wailing and consumer regret is a good sign of sorts at this point; it means something exciting happened!
With current inflation and such, I wonder what the non-X Ryzen 7000s will end up being priced at a year from now.
I just bought a 5800X3D, and it's my first time using an AIO. From what I've read, they are fairly safe these days, but I'm still a little bit worried it'll leak at some point in the future. The 5800X3D can be had for $499 in Australia now, so I figured that should last me for a good 3-5 years (if not longer) and I can skip AM5.
> do they see much gain from larger L3 cache?

From the results I've seen from the 5800X3D, yes. Some games might not see a big gain, but others get massive 20%+ increases literally from swapping from another Zen 3 CPU. It means that to this day, the 5800X3D still beats Zen 4 and Raptor Lake in some games.
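If you want to feel the cache-vs-DRAM gap yourself, here's a rough sketch: random reads over a working set that fits inside a big L3 versus one that spills to RAM. It's plain Python, so interpreter overhead blunts the effect, and the 8 MB / 512 MB sizes are just illustrative picks, not anything from the benchmarks above:

```python
# Rough illustration: random single-byte reads over a small buffer (mostly cache
# hits) vs. a large buffer (mostly misses out to DRAM). Needs ~512 MB free RAM.
import random
import time

def random_reads(size_bytes: int, reads: int = 2_000_000) -> float:
    data = bytearray(size_bytes)
    idx = [random.randrange(size_bytes) for _ in range(reads)]
    total = 0
    start = time.perf_counter()
    for i in idx:
        total += data[i]
    return time.perf_counter() - start

small = random_reads(8 * 1024 * 1024)     # ~8 MB: fits in a large L3
large = random_reads(512 * 1024 * 1024)   # ~512 MB: spills to DRAM
print(f"in-cache: {small:.2f}s  out-of-cache: {large:.2f}s")
```

Games that chase pointers around mid-sized working sets sit right in that in-between zone, which is why some titles jump 20%+ on the X3D parts while others barely move.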
> What consumer workloads does cache affect? I know it does for what we do at work, but the applications we work with are memory bound, and we're finding massive gains going from DDR4 to DDR5, and similarly when AMD introduced the EPYC 7003X series CPUs. By contrast, if games aren't seeing much gain from bandwidth, do they see much gain from larger L3 cache?

5800X vs. 5800X3D with the RTX 4090, in 53 games, with the 5800X3D running at a 200-300 MHz lower clock speed: