GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

I've been playing around with Qwen 3.6 (35B A3B) locally and it is surprisingly capable for my simple scripting needs. Every PowerShell script it has generated has run right the first time, which is more than I can say for the Copilot crap I am allowed to use at work. It's not blazing fast on my AMD GPU but it is getting the job done.
 
I've been playing around with Qwen 3.6 (35B A3B) locally and it is surprisingly capable for my simple scripting needs. Every PowerShell script it has generated has run right the first time, which is more than I can say for the Copilot crap I am allowed to use at work. It's not blazing fast on my AMD GPU but it is getting the job done.
May I ask what sort of resource you're running it on? GPU, RAM and CPU?
 
So... I didn't see a great place to put this, but it's related to case fans, which are kind of enthusiast hardware, and I didn't see anyone anywhere on the internet mention this. Fuck putting this shit on reddit; the farms gets a super obscure exclusive.

HOW TO FLASH LIAN LI UNI HUB AL V2 AND SL V2 FAN CONTROLLERS WITH THE WRONG FUCKING FIRMWARE AND HAVE IT WORK JUST FINE

Many moons ago I got some Lian Li AL140 V2 Uni Fans, and wanted to daisy chain them so I picked up a controller minus the fans that came with it.

I didn't realize I picked up an SL V2 controller. These controllers are at least notionally not compatible with the AL fans. Everything but the fan-to-fan daisy chain cables fit though.

I'm getting ready to actually use the fans nearly 3 years later, when no spare parts are available, and discover this incompatibility. Silly me, thinking a "Uni Hub" would be fucking universal for the fans that can plug into it (they all use the same plugs except for the fan-to-fan daisy chain cords specifically, with the remora connectors), and all the boxes LOOK the same minus the one-letter change on the outside of the case. I plug it in anyway just to see what happens and it kind of works, but because the controller is clearly sending a signal intended for a specific RGB pattern, the AL V2s look all fucked, though I can actually manipulate things, kind of.

This gets me thinking: there is exactly zero chance Lian Li made multiple boxes all of which are incompatible with each other on a hardware level, right? Right?

What if you can just flash the firmware from the fan controller you need on to the fan controller you have, and it will just work?

I noticed that Lian Li has a manual update tool (downloaded via the Google link on this page: https://lian-li.com/l-connect3/l3_update_manually/), however if you use that tool it checks the bin files and won't let you do a switcharoo. Boo.

That said some googling led me to this thread on overclockers: https://forums.overclockers.co.uk/t...troller-with-failed-firmware-update.18951079/

Someone bricked their AL v1 controller and Lian Li sent them a tool to help fix it. They then uploaded that straight onto the internet. I immediately noticed this tool was vastly different from the one provided on the official Lian Li site. These are the instructions provided:
(attached: four screenshots of the recovery tool's instructions)
Things are misspelled, I see two dozen different ways to interface with a device, they're saying you need to kill antivirus, there's a whole ass chip erase feature: perfect. I've got nothing to lose anyway, the controller may as well be useless as is for these fans unless you want an intentionally broken looking light show.

Adapting this set of directions I connect the SL V2 controller without any fans connected to it, make sure L-Connect is not running, and come up with this set of steps:
1) Select USB, ok
2) Ignore steps 2 - 6, that's all troubleshooting and verification of the problem this user was having.
3) Figure out how to get that Flash Upload/Download dialog to pop up (tools menu IIRC).
4) Select "dump to file" and save your shit to a bin, just in case (the zip from Lian Li's website with the gimped flash tool has all the current firmwares on it already but better safe than sorry).
5) Finish dumping.
6) Untick dump and tick chip erase, program flash, and verify flash with BIN file.
7) Select the BIN for the controller you want to have, in my case I had SL V2 and wanted AL V2 so I found the latest AL v2 controller bin and selected that.
8) When finished, hit cancel.
9) Reboot computer.

Voila, L-Connect showed it as an AL V2 controller and the RGB tinkering worked perfectly.

I cannot guarantee this will work with anything but the AL and SL v2 controllers, but I suspect any controllers that look like they should be compatible with each other probably are (I bet you can do this with AL v1 and SL v1 controllers for example). Use at your own risk.
 

Those are essentially post-processing effects. I asked if you had examples of things like that where it affected the state of the game without the CPU doing the actual physics calculations of forces, bounding boxes, collisions etc. Like where the GPU communicates back and changes the game state. It's not a gotcha, I'm curious.
Anything that affects a mesh isn't a post effect, and everything in the current frame can potentially interact with the player. Whether it does depends on the game itself. There really isn't a line between "graphics" and "game state;" ultimately all the game state is for is to affect what's on the screen. The player reacts to what's on screen, gives an input, some underlying data gets updated, and this determines what gets drawn on screen next.

Maybe the waving grass just looks pretty in one game, and in the next, it obscures the vision through your scope as you lie on a hill. In one game the water waves just look pretty, and in the next, they affect your boat handling. In one game, a physically realistic lighting engine is just for looking pretty, and in the next, using your flashlight and NVGs effectively are a key part of night raids. In one game, the cloth physics are just nice to look at, and in the next, you're trying to snipe someone through curtains on a breezy day.
 
I thought about getting one of those for the 32GB of VRAM but eventually decided against it because of noise. How noisy is the 9700? Maybe I'll reconsider if it's not that bad.
I wish I could have gotten my hands on the AI Pro R9700 too but my options were sadly limited trading in through Newegg. It's not loud at all, uses less power than my 6900XT did too.

Here's what I was looking at, trying to get it running:
His performance is about twice mine, I only get up to 40 tokens per second. There's probably stuff I need to tweak in Lemonade Server.
 
Ironically my new system I built in December is running on Crucial DDR5 because it was literally the only thing I could get for under $300, and one of my NVME's is a Crucial drive because it was on sale. What sucks is the engineers at Micron/Crucial make good consumer products and in my history of building computers, always have. I've always liked the sleek minimalist look of Crucial kits over Corsair Vengeance RGB lightshow bullshit.
I got the same kit, DDR5-6400 CL38 in black, from Best Buy for $292 before taxes. Still currently the cheapest 32GB DDR5 kit, at $359 all day long at Best Buy now.

I also probably have the same drive - Crucial P510 2tb, paid $139 for it.
Also, AVOID ADATA NVME'S FOR FUCK'S SAKE, their failure rate is pretty high; their RAM sticks are decent though. Still, preddit at least knows part of their shit with regards to both NVMe's and SSD's, yet there's fucking retards recommending ADATA... insanity.
I have an ADATA Legend 800 1tb, it's over 5 years old, and still runs great to this day. My Silicon Power is the one I had to RMA but the replacement has been going good for 2 years.
 
Lmao, maybe I should look at what I type from time to time. You are correct.
Well you only wrote it once, but you read me write it three times! :story:

So if you're doing this with just 16GB of VRAM that's quite impressive. You say the Reddit poster is getting twice your performance. Twice is a little higher than I might expect, but he has a Blackwell-architecture card, which is about as good as it gets for LLM use right now. It's not pure hardware advantage - the fact that software is better optimised for Nvidia is a factor. But even if that were equally mature for AMD it would still be notably better hardware. It has actual Tensor cores vs. the 9070's "AI units" and it also has around 25% more VRAM bandwidth, which affects things.

The good news is that your performance will continue to improve as AMD software catches up. It won't reach the Nvidia performance though. Are you using ROCm? And are you on Windows or Linux?
 
The good news is that your performance will continue to improve as AMD software catches up. It won't reach the Nvidia performance though. Are you using ROCm? And are you on Windows or Linux?
I'm using Lemonade Server on Windows 11, with the Vulkan backend (as opposed to the ROCm backend) for llama.cpp. I suspect some of that difference between my results and the 5070Ti is the Cuda advantage, and some of it is from me inexpertly duplicating the config in Lemonade.
 
I'm using Lemonade Server on Windows 11, with the Vulkan backend (as opposed to the ROCm backend) for llama.cpp. I suspect some of that difference between my results and the 5070Ti is the Cuda advantage, and some of it is from me inexpertly duplicating the config in Lemonade.
Well, on this subject I am not an expert, but it may interest you to know that I have just tried this out on an RDNA3 card and I get to around 40 t/s and not much over. I tried both ROCm and Vulkan for comparison and in at least my limited testing saw no noticeable difference; if anything ROCm performed worse. I have slightly more VRAM than you so I offloaded a little less to the CPU, but weirdly, if anything, that lowered my t/s. Maybe the quantisation?
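For what it's worth, the ~40 t/s ceiling we're both hitting is roughly what a back-of-envelope memory-bandwidth model predicts once part of the model spills out of VRAM. A crude sketch; every number in it (the bandwidth figures, ~3B active params implied by the "A3B" name, ~0.56 bytes/param at Q4) is an assumption, not a measurement:

```python
def decode_tps_ceiling(bandwidth_gb_s: float,
                       active_params_billions: float,
                       bytes_per_param: float) -> float:
    """Upper bound on decode tokens/s for a memory-bound LLM: every
    generated token must stream the active weights through the memory
    bus at least once, so tokens/s <= bandwidth / active weight bytes."""
    bytes_per_token = active_params_billions * 1e9 * bytes_per_param
    return bandwidth_gb_s * 1e9 / bytes_per_token

# Assumed numbers: ~640 GB/s GDDR6 if everything fits in VRAM,
# ~80 GB/s DDR5 for whatever layers get offloaded to system RAM.
gpu_only = decode_tps_ceiling(640.0, 3.0, 0.56)
spilled = decode_tps_ceiling(80.0, 3.0, 0.56)
print(f"all-VRAM ceiling ~{gpu_only:.0f} t/s, DDR5-bound ceiling ~{spilled:.0f} t/s")
```

With those assumed numbers the DDR5-bound ceiling lands in the high 40s, suspiciously close to what we both measured, which suggests the experts sitting in system RAM are the bottleneck rather than the GPU itself.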

I will say that you're right and it's a nice model for PowerShell. The test script I was having it make did have a couple of minor errors in it but it was well done other than that.

I only had very limited time to play around with this but thanks for raising it, even though I couldn't help you, it's been interesting to play around with. It really shows off that you can still do useful work with 16GB of VRAM which I hadn't really considered to be the case.
 
Actually, speaking of stupid X3D questions: do X3D CPUs even work on Windows 10 IoT? I know a lot of us are running that version of Windows 10, and X3D compatibility is the only thing giving me pause about a 9950X3D.
 