GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

When Nvidia's GPUs stop being profitable and they drop out of the GPU market, and AMD follows suit because they're retarded, is Intel just gonna have a monopoly? Or are we just going to ditch discrete GPUs altogether for iGPUs?
 
When Nvidia's GPUs stop being profitable and they drop out of the GPU market, and AMD follows suit because they're retarded, is Intel just gonna have a monopoly? Or are we just going to ditch discrete GPUs altogether for iGPUs?
Your question is based on an assumption that needs to be challenged in order to provide a meaningful answer.
 
I bought an RTX 3060 Ti for my first PC build.
It was discounted like $60 in an Amazon Prime sale, so I instantly bought it.
Then I learned that the RTX 3060 has 12 GB of VRAM, and the 3060 Ti got even cheaper a few months after I bought it.
I'm happy with what I got, but god damn, what's going on with the 3000 series?
 
Nothing was ignored, I just don't agree. It doesn't address when DLSS isn't an option or when more VRAM is needed. The 3060 is not a 4K card for recent titles.
Here's the 3060 running the latest Modern Warfare at 4K with DLSS and maintaining 80-90 fps.

So long as AI and the server business are booming for Nvidia, they just won't care about the consumer market. Case in point:
[image: Nvidia A100]
This A100 costs around $10,000, and Nvidia sold over 10,000 of them for just a single supercomputer. Compare that profit to your average customer buying one $300 card.
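
Rough napkin math with the numbers above (the margin remark at the end is just my assumption, not anything Nvidia has published):

# Revenue from the single supercomputer order described above,
# versus how many $300 consumer cards it would take to match it.
A100_PRICE = 10_000       # USD, rough per-card price quoted above
A100_UNITS = 10_000       # cards in that one supercomputer order
CONSUMER_PRICE = 300      # USD, a typical midrange consumer card

a100_revenue = A100_PRICE * A100_UNITS
cards_to_match = a100_revenue // CONSUMER_PRICE

print(f"One supercomputer order: ${a100_revenue:,}")                   # $100,000,000
print(f"$300 cards needed to match that revenue: {cards_to_match:,}")  # 333,333

# And that's revenue only; the margin on a datacenter part is far fatter
# than whatever Nvidia clears on a $300 board sold through partners.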

Can confirm, we have a bunch of these at work now. They cost a lot.

Supposed PC enthusiasts are all about celebrating weak hardware releases and upscaling, and now blaming devs for using too many resources or not including their preferred upscaler.

I think it's weird how angry it makes you that NVIDIA is supporting a feature that is in high demand now. I wonder what exactly it is you think you'd be getting if NVIDIA didn't invest in DLSS. It's not like they have magical extra transistors they're refusing to put on the chip just to make you mad.


When Nvidia's GPUs stop being profitable and they drop out of the GPU market, and AMD follows suit because they're retarded, is Intel just gonna have a monopoly? Or are we just going to ditch discrete GPUs altogether for iGPUs?

The future of the discrete GPU market probably looks like the discrete sound card market - for professionals and enthusiasts only.
 
When Nvidia's GPUs stop being profitable and they drop out of the GPU market, and AMD follows suit because they're retarded, is Intel just gonna have a monopoly? Or are we just going to ditch discrete GPUs altogether for iGPUs?
Intel would be the first one to drop out of dGPUs, not the last.

iGPUs/APUs will eventually reign supreme at the low end, hopefully with mega-APUs also targeting the mid-range, with AMD leading the charge on that. But it isn't clear that AMD isn't turning a profit even on the lackluster RDNA3 generation.

AMD has a similar problem to Nvidia. Expensive Epyc server CPUs literally share silicon (the same chiplets) with cheap desktop CPUs, but they still make both. If there's a glut of wafer supply, the poors can be sold to.

There can always be a high-end catering to prosumer/workstation users who want compute performance or gaming at 4090/Titan prices. Maybe we will see made-to-order giant APUs using chiplets to deliver high gaming performance while reducing latency compared to dGPUs.
 
I think it's weird how angry it makes you that NVIDIA is supporting a feature that is in high demand now. I wonder what exactly it is you think you'd be getting if NVIDIA didn't invest in DLSS. It's not like they have magical extra transistors they're refusing to put on the chip just to make you mad.
Less bullshit hardware like the 4060 would be great.

No really. Upscaling is cool until it's being used as an excuse to sell absolute shit. Pretty sure I told you this before.

Defend garbage if you must.
 
Less bullshit hardware like the 4060 would be great.

No really. Upscaling is cool until it's being used as an excuse to sell absolute shit. Pretty sure I told you this before.

Defend garbage if you must.

What you've told me is that if you find out a pixel was generated by applying tensor operations to a set of vectors rather than by applying shader operations to primitives, it makes you angry. You haven't actually told me what a GPU designed with the same die & power constraints as the 4060 Ti could do if only NVIDIA hadn't invested so much in DLSS.

Its raw resources (transistor count and clock speed) put it pretty close to the Arc A770 and 6800 XT, which I suppose are "non-bullshit" cards, since neither Intel nor AMD had at the time invested much in hardware-accelerated ML-based upscaling. So you have some decent reference points to explain what you feel you're being denied, especially since the 16 GB version of the 4060 Ti just dropped.

Intel would be the first one to drop out of dGPUs, not the last.

I wonder about this - Intel is in a better place than NVIDIA when it comes to packaging and promotion. If I'm Dell, Arcs have established themselves as competent GPUs, and Intel's willing to cut deals by bundling Core CPUs with Arc GPUs, I'm going to buy a lot more Arcs.
 
In my infinite magnanimity, I'm going to show some screenshots. My laptop has a 3050 Ti Mobile with 4 GB, so it needs all the help it can get, really. The screen is only 1080p, so no 4K screenshots (plus, you know, 4 GB....)

1080p native, 60 fps
[screenshot]

1080p + DLSS Balanced, 90 fps
[screenshot]

The above image is internally rendering at 1114x626, and here's what that looks like.
[screenshot]

My desktop has a Radeon 6700 XT, and frankly, DLSS 2 is so good that I genuinely regret going with AMD this time. FSR just kind of sucks, and so does XeSS. They introduce tons of shimmering artifacts, while DLSS 2 doesn't.
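
For anyone wondering where 1114x626 comes from: it's just the output resolution multiplied by the DLSS Balanced render scale. The per-mode scale factors below are the commonly reported values rather than anything I pulled out of the driver, so treat this as a sketch:

# Approximate internal render resolution for each DLSS quality mode,
# using the widely reported render-scale factors.
DLSS_RENDER_SCALE = {
    "Quality": 0.667,
    "Balanced": 0.58,
    "Performance": 0.50,
    "Ultra Performance": 0.333,
}

def internal_resolution(width, height, mode):
    # Scale the output resolution down by the mode's render scale.
    scale = DLSS_RENDER_SCALE[mode]
    return round(width * scale), round(height * scale)

for mode in DLSS_RENDER_SCALE:
    w, h = internal_resolution(1920, 1080, mode)
    print(f"1080p + DLSS {mode}: rendered internally at {w}x{h}")

# 1080p + DLSS Balanced comes out to 1114x626, matching the screenshot above.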
 
I genuinely regret going with AMD this time. FSR just kind of sucks, and so does XeSS.
AMD is perpetually about 1-2 generations behind on this shit. Don't even mention AI/ML-only cards.

Lenovo just released a 4060 ITX card in a prebuilt in China.
I just want something like the R9 Nano again... Anyone know where I can get a used one that isn't eBay scalper prices?
 
In my infinite magnanimity, I'm going to show some screenshots. My laptop has a 3050 Ti Mobile with 4 GB, so it needs all the help it can get, really. The screen is only 1080p, so no 4K screenshots (plus, you know, 4 GB....)

1080p native, 60 fps
[screenshot]

1080p + DLSS Balanced, 90 fps
[screenshot]

The above image is internally rendering at 1114x626, and here's what that looks like.
[screenshot]

My desktop has a Radeon 6700 XT, and frankly, DLSS 2 is so good that I genuinely regret going with AMD this time. FSR just kind of sucks, and so does XeSS. They introduce tons of shimmering artifacts, while DLSS 2 doesn't.
I guess they can get by the DX certification (if it still exists) if it can be turned off.
 
I'd pay $1,000 for an RX 8800X APU all day. Put it in a tiny little case with a big fuckoff radiator, everything else solid state? I've been dreaming of building that PC for 20 years.
 
I'd pay $1,000 for an RX 8800X APU all day. Put it in a tiny little case with a big fuckoff radiator, everything else solid state? I've been dreaming of building that PC for 20 years.
If you're talking about Strix Halo, it would probably only be sold by mini-PC manufacturers, because it's supposed to be too big to fit the AM5 socket. Maybe some company would sell you a Mini-ITX motherboard with it.

So there could be many caveats for that product, especially the price. If it falls through, the best we can hope for are the usual desktop APUs, which would give you an upgrade path.

I think AMD should increase the size of future sockets like Intel did. Chiplets make it viable to make larger chips, and a shrinking DIY PC market could gravitate towards those chips even though they would cost more.
 
I just replaced the CPU bracket on my Alder Lake box with one of these bad boys, and the CPU package temp has dropped from 70 C to 53 C while just posting racist trannyhate on the world's most banned forum.


[image: the replacement CPU bracket]


TL;DR, Intel should have had a mechanical engineer on staff to advise them that rectangular chips will have anisotropic thermal stresses and therefore will require a different mounting solution. Frankly, I'm shocked at how huge a difference this is making. If you have a 12th or 13th gen Intel CPU, you really need one of these brackets.
 
So I went and bought a new laptop on eBay for about a third of the retail price and shockingly didn't get scammed. Though it's clearly one of those "fell off the back of a truck" cases - it was literally delivered in a plastic bag.

Anyway, it's got a Ryzen 5 6650 with quad-channel DDR5 RAM. It's way more powerful than any laptop I've had before, but I've noticed it takes way longer to boot than any other machine I have, maybe close to a minute. The longest chunk of that is the machine getting to the manufacturer splash screen; after that it boots into Windows in seconds.

I've been doing some googling, and it looks like others have had similar experiences with DDR5 machines and are speculating that it takes the BIOS longer to train DDR5 memory for some reason. Anyone here have a similar experience? It's pretty annoying that it takes almost three times longer for this thing to boot than my 10-year-old DDR3 machine.
 