GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

Note: the regular 9070 is a solid deal for me. Great performance for less energy consumption is the right balance.
 
I bet that modernizing an architecture, like the Hitachi SuperH, into a multi-core desktop processor, would shag x86 to death, in the right hands. I bore, I grow weary, of these desktop CPUs. Where's all the fun shit? Why didn't PowerPC try their hand at the Desktop space, beyond late-stage Amiga and Apple? Why won't Apple sell me their fancy CPU and get some board partners? I need some diversity in this market.
 
Will DRM reach a point where GPUs have a built-in M.2 slot for a dedicated SSD that encrypted game files are installed to?
 
I bet that modernizing an architecture, like the Hitachi SuperH, into a multi-core desktop processor, would shag x86 to death, in the right hands. I bore, I grow weary, of these desktop CPUs. Where's all the fun shit? Why didn't PowerPC try their hand at the Desktop space, beyond late-stage Amiga and Apple? Why won't Apple sell me their fancy CPU and get some board partners? I need some diversity in this market.
This vaporware is interesting and could be used by x86:
Startup Says It Can Make a 100x Faster CPU - Flow Computing aims to boost central processing units with their ‘parallel processing units’
IIRC it's 100x (their claim) only for software rewritten/compiled to be optimized for it, but it can achieve a 2x speedup without any work.
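The gap between "100x for rewritten code" and "2x for untouched code" is roughly what Amdahl's law predicts: overall speedup is capped by whatever fraction of the program the accelerator can't touch. A minimal sketch (the fractions below are illustrative guesses, not Flow Computing's numbers):

```python
def amdahl_speedup(parallel_fraction: float, factor: float) -> float:
    """Overall speedup when only `parallel_fraction` of the work
    is accelerated by `factor` (Amdahl's law)."""
    return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / factor)

# Unmodified code: suppose only ~55% of the work maps onto the PPU.
print(amdahl_speedup(0.55, 100))    # ≈ 2.2x overall

# Rewritten code: nearly everything (99.99%) runs on the PPU.
print(amdahl_speedup(0.9999, 100))  # ≈ 99x overall
```

So the headline number only shows up when almost nothing serial is left, which is exactly why it requires recompiling/rewriting.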

You are getting AVX-512 and more cores (rumored 24 cores for Zen 6), but the real action is happening with integrated graphics and NPUs. We might get room-temperature integrated quantum co-processors in desktop CPUs before you stop being bored.

Will DRM reach a point where GPUs have a built-in M.2 slot for a dedicated SSD that encrypted game files are installed to?
I don't know about your DRM point, but if done right this could reduce latency and improve performance. Copy all necessary files to a 256 GB SSG and let it rip. AMD released the Radeon Pro SSG but hasn't done much with the concept since. It might get revived for AI accelerators, and SanDisk is trying to get NAND flash used in place of HBM. There are already consumer cards on the market with an internal M.2 slot, but it's just a convenience for PCIe lanes that would otherwise go unused, nothing to do with speeding anything up.
 
I bet that modernizing an architecture, like the Hitachi SuperH, into a multi-core desktop processor, would shag x86 to death, in the right hands. I bore, I grow weary, of these desktop CPUs. Where's all the fun shit? Why didn't PowerPC try their hand at the Desktop space, beyond late-stage Amiga and Apple? Why won't Apple sell me their fancy CPU and get some board partners? I need some diversity in this market.
Nah.
Back in the day, architectures like Alpha utterly crushed x86, but x86 has had a lot of development since, while Alpha has been dead for two decades. Modern x86 processors aren't actually "x86" internally: they're highly optimised RISC-like cores whose front end translates x86. The decoder takes in a bunch of x86 instructions, breaks them down into micro-ops, arranges those micro-ops so that as much work as possible is parallelised, and then executes them while it takes in a bunch more x86 instructions. You could bring that same pipelining to a RISC architecture (ARM does), but modern ARM probably shouldn't really qualify as RISC; it's got too many vector instructions bolted onto its side.
It's convergent evolution. CISC architectures like x86 got more RISCy, while RISC architectures like ARM got more CISCy, because the hybrid approach is simply better suited to real-world workloads.
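The decode-and-split step described above can be sketched as a toy translator: one CISC-style read-modify-write instruction becomes several RISC-like micro-ops. The mnemonics and micro-op format here are invented for illustration, not actual Intel/AMD encodings:

```python
# Toy sketch of a CISC front end: split a memory-operand instruction
# into load / ALU / store micro-ops. Mnemonics are made up.
def decode(instr: str) -> list[str]:
    op, dst, src = instr.replace(",", "").split()
    if dst.startswith("["):  # read-modify-write on memory
        addr = dst.strip("[]")
        return [
            f"load  t0, {addr}",      # fetch the operand from memory
            f"{op}   t0, t0, {src}",  # do the ALU work in a temp register
            f"store {addr}, t0",      # write the result back
        ]
    return [f"{op}   {dst}, {dst}, {src}"]  # register form: one micro-op

for uop in decode("add [rbx], rax"):
    print(uop)
```

A real decoder also renames registers and reorders these micro-ops across many instructions, which is where the parallelism actually comes from.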
IIRC it's 100x (their claim) only for software rewritten/compiled to be optimized for it, but it can achieve a 2x speedup without any work.
Itanium boasted similar performance peaks. The reality is that a compiler clever enough to optimise code for novel designs like VLIW would basically be sentient.
 
Why didn't PowerPC try their hand at the Desktop space, beyond late-stage Amiga and Apple?

The reason they didn't: Consumer products don't have the margin enterprise products do. IBM spent a lot of time throwing off products that were profitable, but didn't make enough margin. Companies tend to be geared toward high margin or high volume, but rarely both.

The reason they don't now: Power has moved away from being a desktop architecture and toward being purely server-oriented. They aren't very fast at single-threaded tasks, so when I've had to plug a monitor and keyboard into a Power server, it's been a painful experience. But when you have multiple threads on a batch level, nothing beats them. We have a 40-core Power server that gets more work done in a day than our 128-core EPYC server. And they are rock-solid, that server has NEVER gone down.
 
Can't add a video for some reason, but 9070 non-XT cards actually seem like a decent buy if you're into power efficiency. Shame all the online retailers are already sold out and the nearest Micro Center is 4 hours away.


Thank God my job lets me touch grass or this would all be very depressing.
 
Couldn't pick one up because I live in a shithole area and the stores had none in stock. However, was able to get this ordered:

ASUS Prime Radeon RX 9070 XT OC 16 GB GDDR6 Video Card

Hopefully the current power supply cables I have will work with it but if not, I've time to order a new set. Looking forward to increased frames.
 
Oh look, AMD once again talks a big game before launch, the techtuber space talks a big game sucking AMD's cock, and then, when the day finally comes for Lisa Su to save gamers, AMD once again shits its pants spectacularly.

Bow before the Lord of GeForce, peasants, and earnestly pray that he deigns you worthy of BLACKEDwell
 
Oh look, AMD once again talks a big game before launch, the techtuber space talks a big game sucking AMD's cock, and then, when the day finally comes for Lisa Su to save gamers, AMD once again shits its pants spectacularly.

Bow before the Lord of GeForce, peasants, and earnestly pray that he deigns you worthy of BLACKEDwell
Obvious bait is obvious.
 
A few updates from Apple:

1. M4 MacBook Air, 16 GB RAM, 256 GB SSD, starts at $999 US.
2. M4 Max Mac Studio, same chip as in the high end MBP last year.
3. M3 Ultra Mac Studio, with up to 512 GB RAM.

M2/M3 MBAs are discontinued as a result.
Imagine paying $1000 for that Air and getting less than $20 worth of storage with it.
Seems like the 9070 XT is a decent card whose price point is only enticing due to current year.
I remember when the 5700xt seemed steep at $400. Those were the days...
 
5000 series turning out to be a total scam. You couldn't pay me to get one at this point.

The PhysX gimping is exactly why DLSS (and all its offshoots) are garbage. There's no guarantee of long term support.
It's sad PhysX never reached its full potential. Games today have worse physics than games 10 years ago.

At that point, you would be better off using an external GPU enclosure that can accommodate the card to a better degree. You would of course need to use an external connector like OCuLink or one of those weird NVMe connectors sometimes used in servers.
MCIO (Mini Cool Edge IO) is the server PCIe cable connector.
Very fragile.


Someday I'd like to upgrade from my 3070. The 8 GB of VRAM was low when it launched and it's really low now.
 