GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

The 5090 is rumored to have a 448-bit bus and 28 GB of GDDR7 VRAM, an improvement over the previous generation, but just shy of the 30-32 GB some AI models are probably designed for.
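That capacity follows directly from the bus width. A quick sanity check (a sketch; it assumes the standard 32-bit-per-chip GDDR7 interface and launch-era 2 GB chip density, neither confirmed for the 5090):

```python
# Each GDDR7 chip has a 32-bit interface; first-wave density is 16 Gb (2 GB) per chip.
CHIP_BUS_BITS = 32

def vram_gb(bus_width_bits, chip_gb=2):
    """Total VRAM with one memory chip per 32-bit channel."""
    return (bus_width_bits // CHIP_BUS_BITS) * chip_gb

print(vram_gb(448))  # rumored 5090: 14 chips x 2 GB = 28 GB
print(vram_gb(512))  # a full 512-bit bus would hit 32 GB
print(vram_gb(384))  # the 4090's 384-bit config: 24 GB
```

Getting to 30-32 GB without widening the bus past 448-bit would need the 3 GB chips that aren't shipping yet.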

Product segmentation at its finest. Although I get the impression that Nvidia would sack off the gaming segment in a heartbeat if they could spin it in a way that wasn’t just “have you retards not seen the margin we get on this AI shit? Why would we use our wafer allocation on anything else?”.
 
These kinds of sanctions are incredibly pointless, since what matters for the kinds of stuff we don't want the Chinks to do is total compute in the datacenter, not compute per PCB.
It makes some of the middlemen rich because of all the smuggling.

It probably has at least a small impact on total compute, but it has also made China laser-focused on building up their own domestic alternatives. It's always possible that a new approach will make ASML's EUV tools obsolete, or their continued intensive spying on the Dutch could pay off.

I'm pretty sure their engineers are in America.
He probably meant Taiwan (TSMC+others), but there are many parts of the supply chain that aren't located in America. Maybe stuff like the fans are made on the mainland, IDK.

Product segmentation at its finest. Although I get the impression that Nvidia would sack off the gaming segment in a heartbeat if they could spin it in a way that wasn’t just “have you retards not seen the margin we get on this AI shit? Why would we use our wafer allocation on anything else?”.
They'd like to, but they need something to fall back on when the AI bubble pops. Token efforts can address the gaming market in the meantime. Nvidia will continue to sell through existing Ampere and Lovelace inventory, then launch only the high-end (5090 and 5080) this year. Maybe the 5090 will have an MSRP of $1800, 5080 at $1000. These will be made using the inferior yields of the same Blackwell dies that will be used to make more expensive workstation GPUs.

Mid-range and low-end will take forever to come out, and can use 7nm/5nm nodes that have been abandoned by the latest AI accelerators. The 5060 might continue to use GDDR6 memory. If they wait a really long time, 3 GB GDDR7 chips will hit the market and they could reduce bus widths, and even make the mythical 9 GB GPU.
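The odd capacities that 3 GB (24 Gb) chips would unlock fall out of the same channel math (a sketch; the bus widths here are illustrative, not leaked specs):

```python
def vram_gb(bus_width_bits, chip_gb):
    """Total VRAM: one chip per 32-bit channel, times per-chip capacity."""
    return (bus_width_bits // 32) * chip_gb

# Today's 2 GB chips on a 128-bit low-end bus: the familiar 8 GB card
print(vram_gb(128, 2))
# 3 GB chips let a cut-down 96-bit bus land exactly on the mythical 9 GB
print(vram_gb(96, 3))
# Or keep the bus at 128-bit and jump straight to 12 GB
print(vram_gb(128, 3))
```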
 
Mid-range and low-end will take forever to come out, and can use 7nm/5nm nodes that have been abandoned by the latest AI accelerators.

They can't use old nodes for new designs without significant reengineering. A 5nm chip would be a rebadged 40 series. Designs aren't even really process-portable any more. 30 series was on Samsung 8 nm and probably couldn't even be made on TSMC N7.
 
They can't use old nodes for new designs without significant reengineering. A 5nm chip would be a rebadged 40 series. Designs aren't even really process-portable any more. 30 series was on Samsung 8 nm and probably couldn't even be made on TSMC N7.
Maybe rebadged. But we've seen RDNA3 on N4 (APUs), N5 (Navi 31/32), and N6 (Navi 33 monolithic die). N4/N5 should be design portable but N6 is part of the N7 family. We've also just seen Intel bragging about making their cores "process-agnostic" by using larger partitions of cells, so that they can put it on their own nodes or TSMC's, and different sizes (Crestmont was on Intel 4 for E-cores and N6 for LP E-cores).
 
Maybe rebadged. But we've seen RDNA3 on N4 (APUs), N5 (Navi 31/32), and N6 (Navi 33 monolithic die). N4/N5 should be design portable but N6 is part of the N7 family. We've also just seen Intel bragging about making their cores "process-agnostic" by using larger partitions of cells, so that they can put it on their own nodes or TSMC's, and different sizes (Crestmont was on Intel 4 for E-cores and N6 for LP E-cores).

You can fabricate a cell designed for a given node on a higher-resolution node, worst case achieving 0% shrink (this is basically what's happening with SRAM as of 5 nm). You can't go the other direction easily. Intel managed to backport Ice Lake's cores, which were designed for their 10nm process, to 14nm (shipped as Rocket Lake), but this wasn't a small effort. It is highly unlikely that NVIDIA will try to modify Blackwell for a coarser, hotter node. All that would happen is they'd get a GPU with heat and performance characteristics similar to the 40 series, plus some extra instructions.

If anything, they're going to want to gobble up as much of TSMC's leading-edge output as possible and block AMD out of the market.
 
Although I get the impression that Nvidia would sack off the gaming segment in a heartbeat if they could spin it in a way that wasn’t just “have you retards not seen the margin we get on this AI shit? Why would we use our wafer allocation on anything else?”.
Actually, the big man has an answer for you:
https://morethanmoore.substack.com/p/q-and-a-with-nvidia-ceo-jensen-huang (archive)
Q: Adam Patrick Murray, PC World - Are you able to give us a taste of what 50-series RTX consumer GPUs will bring?

A: 50-series is something I will tell you about later. I can’t wait, but I can’t say anything. Not yet.

Q: PC Gamer - All the cool gaming stuff you announced - none was announced in the keynote. Do you still love PC Gamers?

A: I love PC gaming, but my keynote was already 2 hours. I didn’t want to torture you guys. It was too long anyway. But I love PC gamers - without the PC gamers we couldn’t build NVIDIA to what it is now.
For real though:
We have 100 million RTX GPUs in the world now, and you can’t find this anywhere else. When we see the future, we start investing straight away. The install base is what matters in computing, and CUDA has 100s of millions of compatible servers, so when you develop on CUDA you know it’ll work anywhere - on your laptop, or a server across the world. Software only comes with an install base. We’ve been visionary in seeing this coming, so all the RTX gaming PCs - they now become an AI PC.
Striking that balance and spreading their dual-use GPUs to lots of consumers has helped their business.
 
Okay, slight update on my quest to build a docking station rig for my ASUS laptop…
  • Monitor: I visited my local Best Buy to see how some models looked, and I really wasn’t impressed with the ASUS monitor I linked earlier; it basically looked like a bigger version of my laptop’s display. Another issue I’ve been seeing with the cheaper monitors is that almost all of ’em are VA panels, and I’ve heard those don’t play nicely with games that have dynamic cameras like FPSs. I did have a look at this Samsung Odyssey G4 monitor and this Samsung Odyssey G5 monitor (I’m specifically looking for monitors that are 27” or larger; I’ve kind of gone past the point of looking for 4K resolution) and, while they do cost more than the ASUS monitors, the G4 is $100 off ($170 off for the G5) and can be bundled with other accessories to drop the price even further.
  • Speakers: I’m still looking for a good set. The Logitech Z407 speaker system I mentioned earlier is included in the list of eligible products that bring the price of the monitor down, but I’ve heard horror stories about the Bluetooth volume-control dongle going bad and Logitech refusing to ship out a replacement. I’m looking at some others, most notably this Logitech Z623 speaker system, but I’m still not sure how to tackle this.
  • Keyboard and Mouse: A friend of mine found an old HyperX Pulsefire FPS wired mouse (said model is old enough to not be visible on their site) I could have for free. He’s currently digging for a keyboard I could use, but the last spare keyboard I saw him with was a cheapo Logitech wireless membrane keyboard with a built-in trackpad. If I do end up borrowing these from him I’ll just go full autismo and get that nice Spongebob-themed keyboard/mouse/mousepad combo I saw, plus a separate number pad since that’s the only thing missing from it being 1-to-1 with my laptop’s keyboard.
  • Docking station: You know how I was dead-set on getting this neat docking station by Anker? Well, I’m not anymore; I noticed that the damn thing has a single HDMI port and no DisplayPort ports. I’m looking at some others that aren’t as snazzy but provide a DisplayPort port and 100 W PD. I’m currently looking at this one by j5create and this one by Dell.
  • Desk: As far as this one’s concerned I’m not too worried, a close family member said that they’d pay for it as a birthday present.
The 3D printer situation is still the same but I’m educating myself on what I need versus what I want. And the what I want side is winning. Any advice on how to continue and where to look/what to look for is appreciated. I also noticed that we actually have a battlestations thread here, should I move this topic over there?
 
The install base is what matters in computing, and CUDA has 100s of millions of compatible servers, so when you develop on CUDA you know it’ll work anywhere - on your laptop, or a server across the world.

Unless your laptop or server doesn't have NVIDIA hardware. However, if you write it in Kokkos, RAJA, SYCL, or even OpenMP (do NOT do this), it will, in fact, run anywhere.

Okay, slight update on my quest to build a docking station rig for my ASUS laptop…

Does your laptop actually have a DisplayPort out?
 
Does your laptop actually have a DisplayPort out?
It doesn’t have an actual DisplayPort out. It has 7 ports on the left side (AC, Ethernet, HDMI, Thunderbolt 4, USB-C, USB-A, 3.5 mm audio) and 1 USB-A port on the right side.

…I have a feeling I may be retarded.
 
It doesn’t have an actual DisplayPort out. It has 7 ports on the left side (AC, Ethernet, HDMI, Thunderbolt 4, USB-C, USB-A, 3.5 mm audio) and 1 USB-A port on the right side.

…I have a feeling I may be retarded.

Thunderbolt 4 has enough bandwidth for DisplayPort out (the spec actually requires DP tunneling support), but you need to be sure the laptop wires it up.
 
Thunderbolt 4 has enough bandwidth for DisplayPort out (the spec actually requires DP tunneling support), but you need to be sure the laptop wires it up.
I checked ASUS’s site and the Q&A section of Best Buy’s listing, the Thunderbolt 4 port has DisplayPort support.

One thing I did notice is that my laptop screen has a 144 Hz refresh rate; from what I’ve read, it sounds like I should stick to a monitor with a 144 Hz refresh rate. I guess it’s back to searching monitors again.

EDIT: I’m getting a lot of conflicting answers on the topic of monitor refresh rates. Some say it can support up to 240 Hz whilst others say the max is 144 Hz. I assume that they’re talking about the laptop’s screen, but it’s just getting fucky.
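For what it's worth, the refresh-rate question can be sanity-checked with back-of-the-envelope link math (a sketch: assumes 8-bit RGB, a flat ~12% blanking overhead, and DP HBR3 tunneled over Thunderbolt 4 at ~25.9 Gbps of payload with no DSC; real link negotiation is messier):

```python
def dp_data_rate_gbps(h, v, refresh_hz, bits_per_pixel=24, blanking=1.12):
    """Approximate uncompressed DisplayPort data rate in Gbit/s."""
    return h * v * refresh_hz * bits_per_pixel * blanking / 1e9

HBR3_PAYLOAD_GBPS = 25.92  # 4 lanes x 8.1 Gbps raw, minus 8b/10b encoding overhead

for name, (h, v, hz) in {
    "1440p 144 Hz": (2560, 1440, 144),
    "1440p 240 Hz": (2560, 1440, 240),
    "4K 144 Hz":    (3840, 2160, 144),
}.items():
    need = dp_data_rate_gbps(h, v, hz)
    verdict = "fits" if need <= HBR3_PAYLOAD_GBPS else "needs DSC"
    print(f"{name}: ~{need:.1f} Gbps -> {verdict}")
```

The point being: the external monitor's refresh rate is negotiated over the DisplayPort link and is independent of what the laptop's internal panel tops out at.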
 
I assume that they’re talking about the laptop’s screen but it’s just getting fucky.
Honestly, technical documentation has been in a horrendous state for so long now.

You'd be lucky to get a page's worth of highlights, and the many details that have real answers just aren't shared with the customer.
 
$1,300 model.

If the high-end Snapdragon X Elite laptops make their way into the bargain bin, I might buy one for $300. I am perfectly willing to spend $250, I'm reaching for my 2 Benjamins right now, $150 is quite fair, etc. They definitely aren't worth over $1,000, even if they are getting fancy specs like an OLED screen and... 16 GB of RAM. They will be demolished by AMD's Strix Point, likely even in efficiency/battery life which is an overrated point for these ARM chips.

SemiAccurate was right yet again. (archive)
 
$1,300 model.

If the high-end Snapdragon X Elite laptops make their way into the bargain bin, I might buy one for $300. I am perfectly willing to spend $250, I'm reaching for my 2 Benjamins right now, $150 is quite fair, etc. They definitely aren't worth over $1,000, even if they are getting fancy specs like an OLED screen and... 16 GB of RAM. They will be demolished by AMD's Strix Point, likely even in efficiency/battery life which is an overrated point for these ARM chips.

SemiAccurate was right yet again. (archive)
Is there a reason that getting 16 GB of RAM is a fucking fight with laptops? Is it some sort of giant leap in costs? When I had to replace my work machine, the guy at the store looked forlorn when I said I wanted something with 16 GB minimum.
 
Is there a reason that getting 16 GB of RAM is a fucking fight with laptops? Is it some sort of giant leap in costs? When I had to replace my work machine, the guy at the store looked forlorn when I said I wanted something with 16 GB minimum.
It really shouldn't be in 2024. It's not that expensive, and it's what gaming handhelds or other gaming devices are expected to include (see Xbox/PS5, or Steam Deck which was cheap at a $400 starting MSRP). Microsoft set 16 GB as the minimum to qualify for Copilot+ certification, so that will likely flush a lot of 4/8/12 GB laptops out.

Now if I can upgrade it myself, then I don't care if it comes with 0-8 GB. The only consideration there is that while SO-DIMM is reasonably priced, the new (LP)CAMM laptops will be very expensive to upgrade in the short term since it's an emerging standard.
 
Huh, apparently there is an MSI B650M Gaming PLUS WiFi motherboard that costs around $155, but it was released recently and I don't see any benchmarks for it. Comparing it to the $205 Gigabyte B650M Aorus Elite AX, which is the better pick? I'm worried about the VRM, supported RAM speed (my RAM is DDR5-6000), and longevity of the chosen mobo (is BIOS flashback necessary?).
 
Huh, apparently there is an MSI B650M Gaming PLUS WiFi motherboard that costs around $155, but it was released recently and I don't see any benchmarks for it. Comparing it to the $205 Gigabyte B650M Aorus Elite AX, which is the better pick? I'm worried about the VRM, supported RAM speed (my RAM is DDR5-6000), and longevity of the chosen mobo (is BIOS flashback necessary?).
I'd go MSI; I trust Gigabyte build quality about as far as I can throw it, which isn't much.
 