GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Thanks for the tips, guys. I am gonna be thinking about it.

I'll probably wait till January and start buying one part every month or so not to break too much bank at once, like I did with my current one.

I am wondering if you could give me a little advice on the case? I know these newer GPUs are pretty big and produce a lot of heat.
Any current PC case is going to be designed to dissipate the heat these things put out. If you are going with a 6700/3070 or less, there are some pretty cool mini-ITX options.

 
Gentoo Slackware OpenBSD or bust!

I have a question for you folks. How often do you build a new PC? I've always considered building a new PC just for the sake of a new build to be "wasteful", so I do not build until my previous system's original parts (CPU+MB) literally die permanently. I do however keep a "backup" PC I can boot into just in case.
For me...
My current PC that I'm replacing is 7 years old.
My previous computer lasted about 5 years (the motherboard failed, and my hard drive was starting to go).
The one before that lasted about 4ish years.
Between these three I've only made a few upgrades, including adding RAM once, upgrading a power supply once, and three GPU upgrades. I tend to use my computers until they crash and burn, in a similar way to you, it seems.
 
I am looking at comparisons between the Ryzen 5 5600 and the Ryzen 7 5700X. I wonder if there's really any noticeable difference.

The 5600 seems like a great deal; I am seeing it at 140 bucks and it already has a cooler bundled in. Will the 2 extra cores of the 5700X be worth it? It would be $60 more plus whatever the cooler costs me.

I am also realizing the Radeon 6700 XT would be a better choice of GPU for a little extra over the 3060, as you guys said.

My current PC that I'm replacing is 7 years old.
My previous computer lasted about 5 years (the motherboard failed, and my hard drive was starting to go).
The one before that lasted about 4ish years.
Between these three I've only made a few upgrades, including adding RAM once, upgrading a power supply once, and three GPU upgrades. I tend to use my computers until they crash and burn, in a similar way to you, it seems.
I am 8 years in with this one. The mobo already failed once and I had to replace it with a used one a couple of years ago. I upgraded it with some extra RAM and also bought a small 200 GB SSD just to install Windows and my work programs, but the damn thing died on me after only 3 years of use.
 
The 5600 seems like a great deal; I am seeing it at 140 bucks and it already has a cooler bundled in. Will the 2 extra cores of the 5700X be worth it? It would be $60 more plus whatever the cooler costs me.
If you can't justify the cost, then it's not worth it. The option of a 5950X or 5800X3D is always open to you in the future. Get a quality motherboard and you'll have done your best with what you have.
 
If you can't justify the cost, then it's not worth it.
You are probably right. The benchmarks I am seeing don't even make much sense, because both CPUs destroy every game they're testing even on max settings, but one is like 140 FPS and the other 145 FPS. I don't think I'll really need more for the programs I work in.



Get a quality motherboard
How does this one look?

 
You are probably right. The benchmarks I am seeing don't even make much sense, because both CPUs destroy every game they're testing even on max settings, but one is like 140 FPS and the other 145 FPS. I don't think I'll really need more for the programs I work in.




How does this one look?

I have the Z690 Intel version of this motherboard, and it's excellent. If I follow through with my stupidly overpriced AMD build, I will be getting the X670E version of this motherboard.
 
My dad doesn't work at Nvidia and it's totally a coincidence that I predicted they were going to pull this shit

[attached screenshot: www.youtube.com, 2022-12-20]

Like Tech Jesus said, and as you can see at almost any retailer, no one wants the 4080. NVIDIA refuses to cut its price, so they are now accumulating dust on shelves. The card sucks, plain and simple, and NVIDIA is just putting up excuse after excuse instead of admitting they blew it.
 
My dad doesn't work at Nvidia and it's totally a coincidence that I predicted they were going to pull this shit

[attached screenshot: www.youtube.com, 2022-12-20]

Like Tech Jesus said, and as you can see at almost any retailer, no one wants the 4080. NVIDIA refuses to cut its price, so they are now accumulating dust on shelves. The card sucks, plain and simple, and NVIDIA is just putting up excuse after excuse instead of admitting they blew it.
Did you predict that before the article was posted 2 weeks ago?

Officially, there is only one RTX 4080, the one currently rotting on store shelves, although sales might have picked up after some of the bad reviews for 7900 XT/XTX.

The 4070 Ti may have the exact specs of the 4080 12 GB, or they could cut it down a little further and launch a 4070 SUPREME later.

Eventually, Nvidia will get the 4080 price down to $1000. People are likely more interested in 4070/7700 or lower anyway.
 
I am looking at comparisons between the Ryzen 5 5600 and the Ryzen 7 5700X. I wonder if there's really any noticeable difference.

The 5600 seems like a great deal; I am seeing it at 140 bucks and it already has a cooler bundled in. Will the 2 extra cores of the 5700X be worth it? It would be $60 more plus whatever the cooler costs me.
The 5600 is extremely good for the price, and anything above it for gaming alone is not worth it IMO.
Keep in mind that the stock cooler is very cheap and small; they've cheaped out on them since second-gen Ryzen. If you end up buying one, monitor temperatures and, if necessary, buy a better cooler.
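If you end up on Linux, a lazy way to keep an eye on the temps is psutil; here's a minimal sketch (sensor and label names vary per board, and sensors_temperatures() isn't available on Windows, where you'd just use something like HWMonitor instead):

```python
# Minimal CPU/motherboard temperature check with psutil (Linux only; labels differ per board).
import psutil

for chip, sensors in psutil.sensors_temperatures().items():
    for s in sensors:
        print(f"{chip} {s.label or 'temp'}: {s.current:.1f} C (high={s.high})")
```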
 
I tend to use my computers until they crash and burn, in a similar way to you, it seems.
I do the same. None of my last 3 had a hardware failure but they just became too functionally obsolete even though they still ran fine, and I suddenly had to downsize a shitload of physical stuff all at once.

I still miss my beige box, but the bitter truth is I can do 99% of everything it did in DOSBox and Windows 7.
 
You are probably right. The benchmarks I am seeing don't even make much sense, because both CPUs destroy every game they're testing even on max settings, but one is like 140 FPS and the other 145 FPS. I don't think I'll really need more for the programs I work in.




How does this one look?

I haven't looked into motherboards in the last five years, but it looks alright after skimming. If you want an in-depth review, I recommend Actually Hardcore Overclocking; his analysis is what I used to pick my current board.
 
I'm really regretting not jumping on that liquid-cooled 6900 XT for $700 a week or two back. GPU prices don't look like they're going to improve in the foreseeable future.
 
I do the same. None of my last 3 had a hardware failure but they just became too functionally obsolete even though they still ran fine, and I suddenly had to downsize a shitload of physical stuff all at once.

I still miss my beige box, but the bitter truth is I can do 99% of everything it did in DOSBox and Windows 7.
I'm not quite there yet, but I'm getting closer to an upgrade. My CPU is a few generations behind, DDR5 RAM would be nearly twice as fast as my current RAM, and I'd be jumping up from PCIe 3.0 SSDs and disks. GPU-wise I'm several generations behind, but I barely game so that's not a big deal. I think when PCIe 5.0 SSDs are available and affordable, that will be the final thing that tips every category over into me being 2+ generations behind, which I feel might justify a jump up. Everything is so expensive now, though.
 
Did you predict that before the article was posted 2 weeks ago?

Officially, there is only one RTX 4080, the one currently rotting on store shelves, although sales might have picked up after some of the bad reviews for 7900 XT/XTX.

The 4070 Ti may have the exact specs of the 4080 12 GB, or they could cut it down a little further and launch a 4070 SUPREME later.

Eventually, Nvidia will get the 4080 price down to $1000. People are likely more interested in 4070/7700 or lower anyway.
The whole thing was pretty obvious from when they scrapped the 4080 12GB. They paid manufacturers for the cost of the boxes they had printed so it was crystal clear that the card would be released but NOT as a 4080 of any kind. Rebranding it as a 4070 Ti was the logical thing to do.
 
@AmpleApricots have you ever used a vector monitor? It's niche as hell but it needs to be seen because it's revelatory, a real holy shit moment.
I am not AmpleApricots but I'd not heard of such things - or at least not known that I had. Just looked them up and they're basically infinite resolution in a sense. I found this which was cool:


[attached image: Space_Rocks_(game).jpg]

Is that what you're talking about? I wonder if the principles could be re-investigated with newer technology. It would be very interesting to have a monitor or TV that didn't have billions of tiny dots comprising its screen. And I wonder if the images could be transmitted to the monitor in a more efficient way, as instructions rather than what are probably essentially bitmaps / differentials between bitmaps? Heck, I wonder if GPUs themselves might have ways to be more efficient if they were returning drawing instructions rather than said bitmaps.
 
@AmpleApricots have you ever used a vector monitor? It's niche as hell but it needs to be seen because it's revelatory, a real holy shit moment.
Old oscilloscopes used these technologies, and then there's also the Tektronix 4010, a terminal that does vector graphics. Little-known thing: the terminal emulator xterm for Linux to this day has a Tektronix emulation mode that can be seen in action with gnuplot, for example. These things are a bit before my computing time though. Tektronix emulation was quite widespread in Amiga, DOS, and Mac terminal software, and that's where I saw it most really, which was more for legacy reasons and the fact that these computers were cheaper to acquire than a Tektronix than anything else.

The interesting thing about the phosphor in genuine Tektronix screens is that it only needs to be hit by the electron gun once and will then glow for a relatively long time (vs. many refreshes with normal phosphor). The reasoning for this once again was technical necessity: if you have a raster-graphics display with a high refresh rate, you need memory as a screen buffer to put the pixels into, and memory used to be easily the most complicated (read: expensive) part of computer technology. If you could get by without that buffer, you could save quite a bit of money. Also, at least for a while after drawing, the graphics were very sharp and wouldn't flicker, because there was no refresh per se and you were just looking at glowing phosphor, which is easier on the eyes. You also could go really high-res and intricate, which would've been prohibitively expensive with anything raster-based because of the memory required. That's why they were very popular for CAD stations, for example for doing PCB design, which lends itself to that whole vector graphics thing. The downside is screen burn-in, and the lines would eventually get blurrier, so you needed a screen refresh. The refresh meant erasing the entire screen first and then redrawing it, which means you couldn't use this for moving pictures. Where did you see one?
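If anyone wants to poke at that xterm Tek mode without going through gnuplot, here's a rough Python sketch that writes raw 4010-style vector bytes straight to it (run it inside `xterm -t`). The coordinate tag bits are from my memory of the old 4010 docs, so treat the encoding as an assumption and check it against the real documentation before trusting it:

```python
# Rough sketch: draw a square in xterm's Tektronix emulation (start it with `xterm -t`).
# The 4-byte coordinate encoding (Hi Y, Lo Y, Hi X, Lo X with 0x20/0x60/0x20/0x40 tag
# bits) is my recollection of the classic 4010 scheme - verify before relying on it.
import sys

GS, US = "\x1d", "\x1f"  # GS enters graph mode, US drops back to alpha (text) mode

def tek_xy(x, y):
    """Encode one 10-bit (0-1023) coordinate pair as Hi Y, Lo Y, Hi X, Lo X."""
    return (chr(0x20 | (y >> 5)) + chr(0x60 | (y & 0x1f)) +
            chr(0x20 | (x >> 5)) + chr(0x40 | (x & 0x1f)))

def draw_polyline(points):
    # The first point after GS is a dark (invisible) move; the rest draw lit vectors.
    sys.stdout.write(GS + "".join(tek_xy(x, y) for x, y in points) + US)
    sys.stdout.flush()

# A square with 400-unit sides starting at (200, 200)
draw_polyline([(200, 200), (600, 200), (600, 600), (200, 600), (200, 200)])
```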

In a wider and only somewhat related sense, there's also a gaming console doing vector-based graphics, the Vectrex. It wasn't popular in my corner of the world (it's a Brit thing) but I actually saw one. Something the YouTube videos on this will not be able to show you is how smooth the output of these technologies is when doing animation. We kinda lost smooth, stable output in the last thirty years and it doesn't look like it'll ever be back. (The Vectrex didn't use any fancy phosphor or special tubes though; it was all in the control circuitry.) I think using the technology in this console was inspired by the Asteroids arcade game, but I frankly don't really know.

TV that didn't have billions of tiny dots comprising its screen
Since I got really interested in small high-DPI screens (everything 220+ PPI), I'd say it comes close enough. A lot of problems you have on lower-DPI screens (visible pixels, aliasing, etc.) aren't really problems anymore at these pixel densities, because while still technically "there", the effects are just too tiny to be visible to the naked eye. It's surprisingly difficult to buy such screens though. It's not that they're expensive (though they are); they're just actually rare. It's almost like screen manufacturers decided 94-100 PPI is enough, because that's where most screens lie; it's probably the most cost-effective density, is my guess. A 16:10 or 3:2 OLED monitor with 250+ PPI at 14-16" max would be my dream, but such a thing literally does not exist as a finished, stand-alone product.
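For anyone who wants to check where a given panel lands, PPI is just the diagonal resolution in pixels over the diagonal in inches; quick sketch (the example sizes and resolutions are only illustrations, not specific products):

```python
# PPI = diagonal pixel count / diagonal size in inches
from math import hypot

def ppi(width_px, height_px, diagonal_inches):
    return hypot(width_px, height_px) / diagonal_inches

# Illustrative numbers only:
print(round(ppi(1920, 1080, 23), 1))   # ~95.8 - right in that common ~94-100 PPI range
print(round(ppi(3840, 2400, 16), 1))   # ~283.0 - a 16:10 panel in the 250+ PPI dream zone
```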

The closest thing to the Tektronix storage-tube idea of just storing the picture inside the screen is e-ink, is my guess. While inherently raster-based, it also has this memory effect, just that with e-ink it doesn't fade. Also, some modern e-ink screens are really fast at erasing/refreshing and can reach framerates of up to 40 FPS.

GPUs themselves might have ways to be more efficient if they were returning drawing instructions rather than said bitmaps
Old GPUs (think ISA vintage) already had "GUI acceleration", which basically was simple line/box/filled-box etc. drawing functions, so the CPU didn't have to do it all itself and push everything to the graphics card's memory pixel by pixel. Because everything in the PC world already happened bitmap-based back then, it was just rarely used effectively. I can't imagine it'd matter much nowadays, at any rate. A 16000x16000 picture (that's 256 megapixels!) would be a 768 MB BMP at 8 bits per channel. For 1979 that'd be an insane size; for today, meh.
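Just to put numbers on the buffer argument, the arithmetic is trivial (the 1024x780 figure is what I remember as the 4010's addressable area, so take that part as an assumption):

```python
# Uncompressed raster size: width * height * bits per pixel / 8 (ignoring headers/padding)
def raster_bytes(width, height, bits_per_pixel):
    return width * height * bits_per_pixel // 8

print(raster_bytes(16000, 16000, 24) / 1e6)  # 768.0 MB - 8 bits per channel, 3 channels
print(raster_bytes(1024, 780, 1) / 1024)     # ~97.5 KiB - a 1-bit buffer at Tek-ish resolution
```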
 
I am not AmpleApricots but I'd not heard of such things - or at least not known that I had. Just looked them up and they're basically infinite resolution in a sense. I found this which was cool:


[attached image: Space_Rocks_(game).jpg]

Is that what you're talking about? I wonder if the principles could be re-investigated with newer technology. It would be very interesting to have a monitor or TV that didn't have billions of tiny dots comprising its screen. And I wonder if the images could be transmitted to the monitor in a more efficient way, as instructions rather than what are probably essentially bitmaps / differentials between bitmaps? Heck, I wonder if GPUs themselves might have ways to be more efficient if they were returning drawing instructions rather than said bitmaps.

The biggest problem with a vector monitor is that it's limited to one color. A color scanline TV has individually colored phosphors, but if you do that, you're back to having pixels again. Plus, CRTs are huge and heavy, which is why they died out. At some point it was still a little more expensive to make flat-screen displays, but the cheaper shipping costs flipped the calculus in their favor. You can't do this with LCDs or LEDs.
 