When has a cool technology ever actually turned out to be cool? We know this LLM will be used to monitor speech.

LLMs and (near?) real-time text-to-speech can be used to make NPCs with dynamic conversations; you can already find demos of this. Other structured data, like quest lines, could also be generated by LLMs. Yeah, this could all suck, but it could be cool if it works.
No way will you be able to scream abuse at the NPC blacksmith without getting banned immediately from PSN, and probably your bank too.
Would have been very cool in the '90s or early 2000s, when devs could still just make games. These days it will be used for more adverts, spying, bad writing, and overly bloated slop.

It sounds like 8-10 GB is the minimum amount of RAM you would want to set aside for these AI/LLM functions, on top of around 24-32 GB for everything else, up from the 16 GB of the current generation. That's how you get estimates like 36-40 GB total. Using LPDDR5X, it won't be as expensive as GDDR6 used to be.
If a developer doesn't use AI, they get that RAM free to use elsewhere. No developer is going to say no to 36, 40, or even 48 GB of RAM instead of 24 GB, which is the absolute cheapest route you could predict for next-gen.
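To make the back-of-the-envelope math above concrete, here's a minimal sketch. The GB figures are the rough estimates tossed around in this thread, not official specs for any console:

```python
# Rough next-gen console RAM budget, using this thread's estimates (not official specs).
llm_reserve = (8, 10)    # GB set aside for AI/LLM functions
base_budget = (24, 32)   # GB for everything else (up from 16 GB this gen)

low = llm_reserve[0] + base_budget[0]    # 32 GB
high = llm_reserve[1] + base_budget[1]   # 42 GB
# The thread rounds this spread to roughly 36-40 GB total.
print(f"Estimated total: {low}-{high} GB")
```

The exact split doesn't matter much; the point is that any LLM reserve pushes the total well past the 24 GB cheap-out floor.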
Maybe I'm just jaded.