GPUs & CPUs & Enthusiast hardware: Questions, Discussion and fanboy slap-fights - Nvidia & AMD & Intel - Separate but Equal. Intel rides in the back of the bus.

This sounds a tad overengineered. Are you doing all this for privacy reasons, or just because you felt like it? There are no-subscription OEM cameras with AI now (well, one brand).
"overengineered" implies "engineered" which is certainly not the case as it's held together with duct tape and baling wire.
Partially because every camera does everything differently, and until recently cameras didn't have these functions.
And this way I control my own destiny... and annoyance level.
 
When did you set this up?
Looks like the oldest camera is from 2016; at the time they just wrote to the NFS share. But I wasn't exactly going to buy all new cameras.
I started out doing the inference on the Pi CPU, at 4 seconds or so, in early 2020.
Looks like I switched to the NCS2 in late 2020.

From my purchase history, looks like the first camera I purchased that mentions Human detection is late 2022.

With my weather I replace about one a year.
 
I did some research on the board I'm getting. Pretty sure I need a TPM chip because it has a TPM header. Recommendations?

Also I bought the USB card today
[Screenshot: USB expansion card listing]
So many ports. I'll have USB-C as well as 4 USB 3.0 ports. Not running out of ports soon, I reckon.
 
Does the board already have TPM in some fashion? You probably don't need the add-on chip, but Windows 11 wants TPM 2.0.
 
I'm going to have to check when I get it, it doesn't say much more other than having a TPM header, and having a bunch of pins like it'll fit a TPM chip. It's being given to me for Christmas, so I can't completely check rn.

The board itself was a bit cheap, but solid. I wasn't the one buying it, so I had to budget.
[Screenshot: motherboard listing]
It's gonna work well for the rig though, since everything else is last gen and not top tier
 
I have this exact board in another machine; yes, it has TPM out of the box.
 
Well, that's good to know. It isn't exactly advanced, so I wasn't sure. Though I'm glad it isn't super advanced: it still has a PS/2 port on the back for an older keyboard, since I'm probably just gonna get one at a thrift store.
 
I have been using it for my living room media/arcade/Steam PC. My only complaint is Nahimic, but it's a non-issue.
 
That's the audio software, right? Honestly, I don't care too much; I've been used to ASRock sound forever, given that previous rigs I've built have always used them.

But yeah, that's basically what I have planned for it. Be an entertainment center; it doesn't need to do anything crazy.
 
Yea, I originally had issues with it. It would randomly disable my audio output. I flashed the newest BIOS, tweaked some things, and removed the Nahimic app entirely, and it hasn't been an issue since. For a budget board it's great, as long as it's a real Pro RS and not the shitty ones they branded and sold to Cyberpower.
 
Well, I'll see if I have issues, but that's a good heads-up. And it is a Pro, so that's good.
 
Not entirely sure which thread to post this in, so anyway.

Question(s) for AI nerds, and please correct me if my understanding is off somewhere, because it probably is. So right now Nvidia dominates the consumer AI market because they have a massive software-ecosystem advantage with CUDA, so pretty much all of the open-source AI models need an Nvidia card to run even halfway well.
AMD is trying to catch up, but it remains to be seen whether they will be able to. It will probably take years; their answer to CUDA is ROCm.
Now a number of different companies are releasing dedicated AI chips, most recently AMD. Team red also says there will be AI processors on their future graphics cards.
Are these AI chips more efficient at LLM stuff than graphics cards? A yet more refined ASIC basically?
 
The big thing with LLMs is bandwidth, not necessarily processing speed, and GPUs have blazingly fast and very well connected RAM. An AI ASIC will never be that useful for LLMs if it isn't also well connected to a shitload of very fast RAM. The most interesting cards to run the more advanced open LLMs on right now are the A6000 with 48 GB and the version of the A100 with 80 GB of VRAM (the latter costs a cool 20k). You really don't want to go lower in the VRAM department unless you want to run tiny and/or heavily quantized models, which as of now are pretty retarded. With some frameworks you can combine several GPUs (and their VRAM) to do inference. With llama.cpp you can also do inference on the CPU; with a decent CPU and enough fast RAM it's actually quite bearable (e.g. on the Mac). Bear in mind we're only talking quantized models here. Full-accuracy models of decent sizes really don't fit on normal hardware you'd buy for your home.

So GPU manufacturers can effectively gatekeep LLMs by just not putting enough VRAM on consumer hardware. Processing speed isn't really the issue unless you're trying to serve hundreds of people; bandwidth is. So in short, any AI LLM accelerator would only be interesting if it comes with at least 48 GB of fast RAM onboard, which in LLM country is actually a *really* moderate amount of RAM.
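To put some napkin math behind the bandwidth point: for single-stream decoding, every generated token has to stream the full set of weights through the chip once, so tokens/sec tops out around bandwidth divided by model size. The bandwidth figures below (~2000 GB/s for A100-class HBM, ~90 GB/s for desktop dual-channel DDR5) are rough illustrative numbers, not benchmarks:

```python
# Back-of-the-envelope ceiling for memory-bound LLM decoding:
#   tokens/sec <= memory bandwidth / weight footprint

def model_size_gb(params_billion: float, bits_per_weight: int) -> float:
    """Approximate weight footprint in GB, ignoring KV cache and overhead."""
    return params_billion * bits_per_weight / 8

def tokens_per_sec(bandwidth_gb_s: float, size_gb: float) -> float:
    """Upper bound on single-stream decode speed for a memory-bound model."""
    return bandwidth_gb_s / size_gb

q4_70b = model_size_gb(70, 4)    # ~35 GB: fits on a 48 GB A6000
f16_70b = model_size_gb(70, 16)  # ~140 GB: no single card holds it

print(f"70B Q4 on ~2000 GB/s HBM: {tokens_per_sec(2000, q4_70b):.0f} tok/s ceiling")
print(f"70B Q4 on ~90 GB/s DDR5:  {tokens_per_sec(90, q4_70b):.1f} tok/s ceiling")
```

Same model, roughly 20x difference in the ceiling, purely from memory bandwidth; that's why VRAM size and speed, not raw FLOPS, decide what's usable at home.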

That said, smaller models get better all the time and this post might be horrendously outdated in six months.
 
I think AMD is a lot more undeveloped on the software side before you even get into hardware power comparisons. CUDA is a dream and ROCm leaves me buckbroken every time. God help you if it involves anything new or experimental, which most AI things are going to be.
 
Intel Meteor Lake Analysis - Core Ultra 7 155H only convinces with GPU performance

It's launch day for Meteor Lake, and we are starting to get confirmation that it's underwhelming and overhyped: outperformed by Raptor Lake, less efficient than Phoenix. The full integrated GPU is probably on par with or 5-10% faster than the 780M in Phoenix, which is good. That leapfrogs Rembrandt and makes it viable for 1080p gaming.

I want to see some testing of the "U" chips, but one thing that stands out from the specs: the "U" chips are limited to a GPU tile with half the GPU cores. Contrast that with Alder/Raptor Lake, which had the same maximum of 96 EUs on the "U", "P", and "H" dies:
[Chart: Core Ultra model lineup and specifications]

Also, ComputerBase says that the Core Ultra 7 164U and Core Ultra 7 134U with lower 9W base TDPs will only support LPDDR5x-6400, no DDR5 SO-DIMM support (that's reflected in the chart but easy to overlook). Maybe those are the ones with the memory on package? They are also coming out a little later, in "Q1 2024".
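For a rough sense of what the LPDDR5x-6400 restriction means, peak theoretical bandwidth is just transfer rate times bus width (assuming the usual 128-bit mobile memory interface; these are theoretical peaks, not measurements):

```python
# Peak theoretical bandwidth = transfers/sec * bus width in bytes.
def peak_bw_gb_s(mt_per_s: int, bus_bits: int) -> float:
    return mt_per_s * (bus_bits // 8) / 1000

print(peak_bw_gb_s(6400, 128))  # LPDDR5x-6400, 128-bit bus
print(peak_bw_gb_s(5600, 128))  # dual-channel DDR5-5600 SO-DIMMs
```

So the soldered LPDDR5x actually comes out ahead of DDR5 SO-DIMMs on paper (102.4 vs 89.6 GB/s); the trade-off on those 9W "U" chips is upgradeability, not speed.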

More reviews (all of Core Ultra 7 155H so far).

 

Guess I'm glad I'm going with 12th-gen Intel; at least I know what I'm getting for a relatively low price.
 