I think they're starting out with these "AI engines" in mobile to see how it plays out. If evaluating lots of many-parameter functions really does become the next big thing in computing, they'll be in every desktop CPU soon enough. If ChatGPT ends up being a huge fad, then no worries, they didn't go all-in. Intel went all-in on AVX-512 prematurely, and it ended up being a hot, expensive waste of silicon (in part because, again, most programmers never bother to write SIMD-friendly code).
I can almost guarantee on-package RAM is coming (Apple's unified memory is LPDDR mounted on the package next to the die, not literally on the die itself). The reason it won't be in Meteor Lake or Strix Point is that the basic design parameters for those chips were locked in ~2 years ago, when there were still a lot of question marks around whether Apple's move was the right one. And, as it turns out, it was. Most people do not, in fact, ever upgrade their RAM. The US PC market (meaning all desktops + laptops, not just Windows machines) is down 7.3% YoY, while in the same period Apple's sales were up 18%. I don't need any inside info to know that everyone else has noticed that computers with no dGPU and fixed RAM are growing like crazy, and OEMs are telling Intel and AMD they want a competitive product.