GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Lmao, Nvidia shit in the Philippines has been reduced to 1060s and 3050s. I don't even see 2000-series GPUs there. It's a fucking ghost town. Since you guys are experienced in GPUs, what's more appropriate for the 6700 XT: the Sapphire PULSE or the XFX SPEEDSTER SWFT 309? There are too many variants.
 
Got some hate there lol. Any reason?
It's a gimped card that should have had more VRAM. It heralded Nvidia's "ackshully, silly gamers, you don't need that much VRAM" era - see the cope they put out right before the terrible 4060 launch.

I don't hate the card itself, just what it represents. Same for the 4060 and 4070.
 
It's a gimped card that should have had more VRAM. It heralded Nvidia's "ackshully, silly gamers, you don't need that much VRAM" era - see the cope they put out right before the terrible 4060 launch.

I don't hate the card itself, just what it represents. Same for the 4060 and 4070.
I'll admit the 8 GB of VRAM in the 3070 is not great. The 3060 at least gives you 12, which is something.
 
I'll admit the 8 GB of VRAM in the 3070 is not great. The 3060 at least gives you 12, which is something.
Yeah, I just don't like this period we're in where companies make underpowered GPUs, then sell them with big FPS bar graphs that rely on the latest iteration of software to come anywhere near the claims.

Then, when a game dev doesn't use the tech for whatever reason, we have to stir up big ole stories about how they were paid off not to use it.

Instead of asking why we're being sold underpowered hardware at top-shelf prices.
 
Since you guys are experienced in GPUs, what's more appropriate for the 6700 XT: the Sapphire PULSE or the XFX SPEEDSTER SWFT 309? There are too many variants.
My instinct says to always go with Sapphire, which has a close relationship with AMD, but better check with someone else.

This is true. Not too long ago they released the 3060 gimped edition with less VRAM and, I believe, a narrower memory bus.

While keeping the name and whatnot the same.

I don't think there are many on the market, but double check obviously.
 
Yeah, I just don't like this period we're in where companies make underpowered GPUs, then sell them with big FPS bar graphs that rely on the latest iteration of software to come anywhere near the claims.

Then, when a game dev doesn't use the tech for whatever reason, we have to stir up big ole stories about how they were paid off not to use it.

Instead of asking why we're being sold underpowered hardware at top-shelf prices.
Yeah, it's bullshit. If I buy a 3070 over a 3060, I expect at least equal if not more VRAM, not cheap tricks that just barely make it better. All I'm asking is to play modern games at an OK frame rate at 1080p WITHOUT lag. Like, I'll take 30 fps if I have to, I really am not that picky, just let me play Baldur's Gate, bruh. I care more about the game than peach-fuzz graphics.
Not all of them, make sure you double check
Haven't seen many of those on Newegg, tbh; most are 12 GB, even the cheapest 3060, but I'll be vigilant.
 
Yeah, it's bullshit. If I buy a 3070 over a 3060, I expect at least equal if not more VRAM, not cheap tricks that just barely make it better. All I'm asking is to play modern games at an OK frame rate at 1080p WITHOUT lag. Like, I'll take 30 fps if I have to, I really am not that picky, just let me play Baldur's Gate, bruh. I care more about the game than peach-fuzz graphics.

Haven't seen many of those on Newegg, tbh; most are 12 GB, even the cheapest 3060, but I'll be vigilant.

Any card out now is going to be able to run any game out now at 1080p with some mix of settings that looks very good. My laptop has a 6 GB 3050 mobile, which is a downclocked version of the base card, and games look great.
 
Any card out now is going to be able to run any game out now at 1080p with some mix of settings that looks very good. My laptop has a 6 GB 3050 mobile, which is a downclocked version of the base card, and games look great.
Well, that's great to know. Really, that's all I want. I'm not picky. I'd rather take the money and invest in storage than in a massive card.
 
Just search YouTube for videos of the games you want to play + 1080p + the card you want to buy; you'll see a variety of people talking about how fast it runs and on what settings. Minimum settings at 1080p is for 7-year-old cards with 3 GB of VRAM; you will be fine.
Well, that is good to know, truly. My being stingy on cards for other people's builds seems to have been alright, then. I think I still want a 30-series card for expansion options, just a little extra breathing room; that, and the 3060 is fairly cheap, all things considered. For reference, the screen is 99% coming from a flea market. I just need it to run well and look good on that.
 
Newegg is now allowing card trade ins:

Upgrading to a new graphics card can be a hassle, and it has been even more difficult ever since the GPU shortage. Today, there are way too many models to choose from, and keeping track of prices is not easy. In an effort to make things a bit simpler, Newegg has announced a new trade-in program. The online retailer is offering customers a deal in which they send in their existing eligible GPU and receive a trade-in credit amount toward the purchase of a new qualifying graphics card.

According to Amir Asadibagheri, product manager of customer experience for Newegg, “the benefit of our trade-in program is the ease to send a used graphics card and buy a new one all within the same platform and avoiding the hassle of selling through a secondary market.” Newegg has given a list of all Nvidia and AMD graphics cards that are eligible, along with an estimated trade-in value. Notably, the trade-in is limited to Nvidia’s RTX series and AMD’s Radeon 5000 series and beyond.

From what we can see, the trade-in value is not as impressive as what one might get from various resale marketplaces. For instance, the trade-in value for an RTX 3070 Ti is listed at $237. However, if you check the prices on eBay, it is possible to get anything from $250 to $400 depending on the model and condition of the card. On the flip side, Newegg mentions how its program offers a more seamless experience. One does not need to go through the process of listing their product on a reseller platform, paying a seller's fee, or relying on a private party to complete the transaction. If you are interested in the trade-in program, here's how the entire process works:

  1. Customers looking to upgrade to a new graphics card in exchange for their existing GPU can easily identify eligible products by spotting a banner on most graphics card product pages on Newegg.com. A pop-up form allows them to verify if their graphics card qualifies for the trade-in program. This form collects essential details such as chipset, model, and brand.
  2. Based on the information provided by the customer, an initial shopping credit offer for a new graphics card will be displayed on the website. If the customer decides to accept the trade-in offer and proceeds to purchase a new graphics card from the website, they will receive a prepaid shipping label and detailed instructions via email. This facilitates the hassle-free shipping of their graphics card to Newegg at no expense, with a 14-day window for completion.
  3. After Newegg receives the shipped graphics card, a thorough inspection is conducted to confirm that it aligns with the details provided during the initial submission. Once the verification process is successfully completed, the customer will be credited with the agreed-upon amount, which will be applied to their new graphics card purchase. The credit will be refunded to the customer’s account using their original payment method. However, in cases where the provided information is inaccurate or Newegg declines the submission, no credit will be issued, and the graphics card will be returned to the customer.
It is advised to read all the terms and conditions before taking part in the promotional trade-in program. According to Newegg, customers are solely responsible for all costs associated with the packaging of the trade-in device for shipment if it is rejected. Any trade-in GPU that fails to meet Newegg’s inspection or is otherwise unacceptable for any of the reasons listed will be rejected.

My thoughts: I sold my 6700 XT recently for $200 because fuck it, that's what it's worth. People overvalue their used, shitty gear and I'll never understand why people pay it. I used to work in sound production, and my god, was it egregious there. People are still out there trying to get $550 for a 5-year-old M-Audio keyboard that cost $600.
 
Morning predictions:

Intel's process node will catch back up to TSMC. After that point, AMD will no longer be able to rely on manufacturing alone to beat them. Here's how the market will shake out:
  1. AMD will continue to win big in the server/workstation space (workstation CPUs are basically the same as server CPUs). That's where chiplet design, 3D V-Cache, and other AMD technologies deliver enormous gains, and Intel is just behind there. Even with manufacturing parity, AMD is ahead, and this is now their game to lose. Intel will stabilize, but has no viable path back to a 95% market share.

  2. The story is different in desktop & laptop CPUs. Heterogeneous cores have been a huge success there, largely closing the raw performance gap with AMD, despite the node disadvantage. Consequently, Intel has mostly held onto its market share in this space. AMD's 3D V-Cache can squeeze a few more frames out of a 4090, but otherwise has almost no effect on consumer computing. Intel's designs have been more forward-looking than AMD's since Alder Lake hit, and Meteor Lake is pushing that forward even more. When Intel Foundry closes the gap with TSMC, it's over for AMD here.

  3. NVIDIA will continue to dominate AI, but Intel will take the #2 spot away from AMD. The simple reason is that Intel is far better at software than AMD, and has done a lot of work, especially with oneAPI, to ensure their hardware is easy to support. AMD's software is a shambling mess: half of it is outsourced to India, it crashes constantly, and the documentation is often wrong.
 
NVIDIA will continue to dominate AI, but Intel will take the #2 spot away from AMD. The simple reason is that Intel is far better at software than AMD, and has done a lot of work, especially with oneAPI, to ensure their hardware is easy to support. AMD's software is a shambling mess: half of it is outsourced to India, it crashes constantly, and the documentation is often wrong.
AMD could be selling out GPUs across the board if they got ROCm working properly on Windows... instead, they just released a brand-new card I can't recommend to anyone, because everyone wants to play with AI now and the software is so bad that the Arc GPUs are beating top-dollar AMD cards, as XeSS is already being widely implemented. I am so butthurt about this too.
 
Morning predictions:

Intel's process node will catch back up to TSMC. After that point, AMD will no longer be able to rely on manufacturing alone to beat them. Here's how the market will shake out:
  1. AMD will continue to win big in the server/workstation space (workstation CPUs are basically the same as server CPUs). That's where chiplet design, 3D V-Cache, and other AMD technologies deliver enormous gains, and Intel is just behind there. Even with manufacturing parity, AMD is ahead, and this is now their game to lose. Intel will stabilize, but has no viable path back to a 95% market share.

  2. The story is different in desktop & laptop CPUs. Heterogeneous cores have been a huge success there, largely closing the raw performance gap with AMD, despite the node disadvantage. Consequently, Intel has mostly held onto its market share in this space. AMD's 3D V-Cache can squeeze a few more frames out of a 4090, but otherwise has almost no effect on consumer computing. Intel's designs have been more forward-looking than AMD's since Alder Lake hit, and Meteor Lake is pushing that forward even more. When Intel Foundry closes the gap with TSMC, it's over for AMD here.

  3. NVIDIA will continue to dominate AI, but Intel will take the #2 spot away from AMD. The simple reason is that Intel is far better at software than AMD, and has done a lot of work, especially with oneAPI, to ensure their hardware is easy to support. AMD's software is a shambling mess: half of it is outsourced to India, it crashes constantly, and the documentation is often wrong.
How much faith do you have in Intel sticking to its timeline for Intel 4, 3, 20A, etc.? I know they have their expected release dates, some are already being taped out, and so on, but I just can't put that much belief in Intel's schedule when they spent as much time as they did on Skylake refreshes.
 
AMD could be selling out GPUs across the board if they got ROCm working properly on Windows... instead, they just released a brand-new card I can't recommend to anyone, because everyone wants to play with AI now and the software is so bad that the Arc GPUs are beating top-dollar AMD cards, as XeSS is already being widely implemented. I am so butthurt about this too.

Hobbyists buying gaming GPUs for dual-use computing are a negligible fraction of the market, and discrete GPUs are a declining market anyway. There are three important fronts in AI/ML where the real money is:

  1. Datacenter GPUs. NVIDIA is winning here, and will continue to win. Google just spent hundreds of millions of dollars on H100s, for example. AMD is struggling here, because NVIDIA's software stack (everything from CUDA to Omniverse) is better than theirs... but so is Intel's. A colleague of mine couldn't even get AMD's tools to work without modifying their source code. PVC (Ponte Vecchio) is a day late and a dollar short, but Intel will fix its hardware problems long before AMD fixes its software problems.

  2. Desktop applications. Gaming GPUs are DOA as mainstream AI/ML products. Nobody wants to buy a big, fat ULTRA GAMER POWERED tower so their automatic photo tagging runs a bit faster. And on laptops, just forget it. The winning hardware solution here is going to be integrated into the CPU, and NVIDIA has no entry into this market. I have no read on whether Intel's solution is better than AMD's, because I just don't have a deep understanding of neural engine hardware, but what I do know, for a fact, is that the software they're deploying for it, from dev tools to libraries, is already crushing AMD's.

  3. Startups. You get the startups, you get the future. Startups are extremely cash-bound and time-bound. Nobody wants to fuck around with half-broken tools or pay a 10x premium for ultra-powerful hardware when you have no money. In the world of accelerated computing right now, you can get started supporting both Intel and NVIDIA with a $1500 laptop, work through the clean, clear tutorials they have online, and have your dev stack up and running in a week. With AMD, you're SOL unless you have $10K to plop down on one of their newer Instincts, not to mention the $7K+ workstation or server node to put it in - and don't bother with the old ones; if you think shit breaks all the time on a new Instinct, try doing it on an older one. I already know of multiple projects that will launch on Intel & NVIDIA while ignoring AMD, because it's just that much of a pain in the ass to work with AMD. In other words, if you have an NVIDIA or Intel GPU (including integrated Xe GPUs) or an Intel neural engine, it will use that, and if you have all-AMD hardware, lol, get fucked, it falls back to x86 mode.
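To make that last point concrete, the device-selection logic tends to boil down to something like this - a rough PyTorch sketch of the pattern, not any specific project's code (the pick_device helper is made up; it assumes a recent PyTorch where torch.xpu exists, and older Intel setups need intel_extension_for_pytorch imported first):

import torch

def pick_device() -> torch.device:
    # NVIDIA path; ROCm builds of PyTorch also report as "cuda" - when they work
    if torch.cuda.is_available():
        return torch.device("cuda")
    # Intel path: torch.xpu covers Arc and integrated Xe GPUs
    if hasattr(torch, "xpu") and torch.xpu.is_available():
        return torch.device("xpu")
    # Everyone else falls back to plain x86
    return torch.device("cpu")

model = torch.nn.Linear(128, 10).to(pick_device())

Notice there's no dedicated AMD branch at all: if the ROCm "cuda" impersonation doesn't work on your card, you're running on the CPU.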
The dark horse here is that GPUs may not actually be ideal for AI/ML. GPUs were designed for graphics, and it just so happened, by coincidence, that hardware designed to turn polygons into pixels is far better at AI/ML than a traditional x86 CPU. But Intel, AMD, and Apple all claim their neural engines are better at the task than GPUs, and Intel's Gaudi2 is faster than the H100 on a price/performance basis - which is especially interesting, since it's fabricated on TSMC N7, while the H100 is on N4. All things being equal, you'd expect the hardware on the more advanced node to just shit all over the competition - which implies all things are not equal. NVIDIA has all the mindshare right now, in no small part because today's AI/ML developers fondly remember running Doom 3 on a GeForce 4 Ti, but their strategy of using the same architecture for both gaming GPUs and AI/ML accelerators may not be sustainable.

How much faith do you have in Intel sticking to its timeline for Intel 4, 3, 20A, etc.? I know they have their expected release dates, some are already being taped out, and so on, but I just can't put that much belief in Intel's schedule when they spent as much time as they did on Skylake refreshes.

Everyone behind that debacle has been fired, and the people who said, from the beginning, that they were fucking up are now running the show. The biggest personnel change is Gelsinger himself. He should have been made CEO over Krzanich, but in 2012 nobody wanted a CEO who talked about dumb nerd stuff like high-powered lasers and semiconductors. They wanted a CEO who cared about diversity and online bullying.
 
The story is different in desktop & laptop CPUs. Heterogeneous cores have been a huge success there, largely closing the raw performance gap with AMD, despite the node disadvantage. Consequently, Intel has mostly held onto its market share in this space. AMD's 3D V-Cache can squeeze a few more frames out of a 4090, but otherwise has almost no effect on consumer computing. Intel's designs have been more forward-looking than AMD's since Alder Lake hit, and Meteor Lake is pushing that forward even more. When Intel Foundry closes the gap with TSMC, it's over for AMD here.
Intel's heterogeneous approach is interesting, but overrated relative to AMD. AMD is expected to follow suit with the 4+8 Strix Point mobile parts (12 cores, 24 threads). If you need more multi-threaded performance than that in a laptop, they'll have the 16-core Dragon Range/Fire Range and maybe Strix Halo.

Efficiency is where it's at for laptops, and AMD has been strong there with Rembrandt and Phoenix. I think Intel has a potentially great idea with the two LPE cores in the SoC tile; we'll have to see how well Windows handles them in practice.

AMD got its XDNA AI chip into mobile first. Maybe Intel will beat them on software support, but it's pretty important to the future of AMD, and they bought more expertise with the Xilinx acquisition. I'm looking forward to seeing whether AMD pushes XDNA hard on desktop, since they hinted it won't be in Granite Ridge (Zen 5).

Intel is likely to copy AMD and pursue big 3D caches - probably the Adamantine L4 cache, though not in Meteor Lake. What AMD hasn't done is put any big caches in mobile (other than Dragon Range X3D) or let the integrated graphics access them. If they do both, it could have a much bigger impact than squeezing a few more frames out of a 4090.
 
I'm upgrading my video card and ended up going with AMD again; I have a 7900 XTX on the way, thanks to Nvidia's BS pricing tiers and because I didn't like how scant the 4070 Tis are on VRAM and memory-bus width.

None of the creative stuff I do involves AI, and I'd rather have better raster performance and gobs of VRAM than ray tracing.
 