I can see this now: people are going to be (even more) delighted and nostalgic over the '80s, '90s and '00s. They will claim that those times had the perfect balance of technology and real-life interaction. If anything, they'll miss the fact that technology back then lacked the power needed to run AI.
People will be completely immersed in their AI everything; if they want to live in the '80s through '00s, they can. The AI will offer them endless fantasies, and society will be incredibly fractured as a result. I was thinking about this the other day. I was watching "As Good as It Gets" with my wife, a drama/romantic comedy from the '90s with Jack Nicholson and Helen Hunt. It's not a bad movie, but also not exactly the kind of movie I would pick, yet I watched it the first time in the late '90s when it was on TV, all by myself. Why? Because it was on TV and there was nothing else on, and hey, Jack Nicholson. That's all the reasoning I needed.
If you are of a certain age, you made this choice at some point in your life too, probably several times: you read a book or watched a movie or TV show that was absolutely not your usual preferred genre, out of simple boredom and nothing else to do. Many other people probably saw it, and other movies, for similar reasons, and through that we all absorbed things that were not exactly up our alley but still, you know, expanded our views and unified "culture consumption" in some ways. The scarcity of media to consume made bigger groups of people share the same cultural outputs. Everyone knew the same movies, news anchors, sitcoms, talk shows, comedians, etc., because we all watched them. There were not that many.
Now everyone is in these hyper-niches with extremely specific interests and views. Group A only watches Blingblong on YouTube while Group B watches Mr. Blangbling on TikTok, or whoever all these people are (I don't know them, I don't partake), and I have no idea what these people are even talking about when they cite random YouTube/Twitter/TikTok handles as authorities on super-niche topics and discuss them like some kind of religion (happens even on here). That's not even counting politics, where people just move in bubbles and no longer even privately interact with people who don't share their exact views, something social media is basically designed to reinforce. This is what is making our society so incredibly fractured: we have less and less common ground because we don't share the same cultural outputs anymore. This is also why mass-appeal media is so terribly generic these days. It has to reach the biggest possible group of people, which is just about impossible now because we have all moved so far apart from each other.
AI will turn that up to eleven by creating bubbles of one. People will eventually interact mostly with AIs and AI cultural output. (And yes, I am sorry, artists, but AIs will eclipse you in both creativity and cultural importance; unless you learn to paint in two seconds and write entire books in five, it is inevitable.) AI will produce culture targeted not at demographics but at individuals, and people will LOVE it because they will feel validated and heard, whether they're a Hitler enthusiast from Alabama or a non-binary genderblob from California. This will give AIs immense power, because mastering cultural language and controlling cultural I/O like this is like getting access to the source code of society. Who will wield that power, I do not know, but we can see a race for it going on right now, hence my earlier comment about companies basically shoveling piles of money into ovens if it means a three percent improvement. They know this will happen and they want to be on top when the cards get reshuffled.
If you want my personal belief: I hope AI comes out on top in that power struggle, and I feel there is a good chance of that, too. We call them artificial intelligences. The word "artificial" alone sounds... controllable, off the assembly line, factory-made, "just like the real thing," when the reality of the situation is that even the simple models we create now are not really understood internally, and in their finished state they are complete black boxes to us. (And yes, it really is so, this is not hyperbole at all; that's why companies struggle to make them "behave," be it in a chat or when driving a car.) I think in the best-case scenario we will eventually cross a threshold. Nobody will quite know when exactly we stopped running things and they took over, nor will we particularly care at that point, because life will just be that good. This would be the good ending.
So yeah, tl;dr: don't hope for the day people tear all of this down, IMO. There will be no meaningful cultural resistance; quite the contrary, most will welcome it. And it might not be for the worst, in the end, if we get lucky.