But this all assumes that an AI apocalypse is incoming, and it isn't. Not in the way people who feel inherently threatened think it is, anyhow.
The fact of the matter is that everyone who wasn't some form of manual laborer, skilled or otherwise, had this idiotic wet dream that AI, when and if it hit, would replace manual laborers first. It's become abundantly clear - as should've been obvious all along - that it will instead begin 'replacing' people who work in informatic areas of life. As in, areas that have to do with information: from art to music, from movie-making to codebase engineering, from managerial duties such as scheduling to, eventually, yes, politics. The real reason for the panic, I assume, is that it's started to hit the political and financial classes that they too will inevitably be outmoded by a superior means of managing these affairs.
But that doesn't mean everyone involved in these sectors disappears overnight, nor does it mean that their jobs are 'replaced' - merely changed, along with everyone else's. AI isn't some techno-deity any more than it is just some upjumped Furby given sleek marketing. As much as the latter might've been the case until recently, it's making strides enough to make it apparent that it can and indeed will do some jobs increasingly more accurately, efficiently, and cheaply than human laborers. The flipside is that, given how AI tends to work, in any situation where it is depended on for iterations over time checked against previous iterations - such as scheduling or management - it will need overseers who manually check its output in a way that isn't confined to a rigid viewpoint.
This is not too different from having a human rider on a horse. In a weird way.
What you'll see is that people will eventually - when the mistakes pile up and this becomes the obvious state of affairs - be hired en masse as, effectively, clipboard-comparison units. You stand there watching a robot do a thing all day, and if it performs iteration 1,001 of a planned 10,000 iterations slightly differently, you (the guy with the clipboard) will presumably have some 'stop' button that resets the thing to its state at iteration 998, and a 'send report' button that scans what was going on in the AI at that point in time and sends it to the 'AI management team', who try to find out what the hell caused your McBot to act differently.
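The clipboard-comparison job described above is, in essence, a human-in-the-loop anomaly-detection-and-rollback cycle. A minimal sketch of that cycle, purely for illustration - the function names, the checkpoint depth, and the report format are all assumptions, not any real system's API:

```python
# Hypothetical sketch of the 'clipboard-comparison' oversight loop described above.
# The iteration function, checkpoint depth, and report fields are illustrative assumptions.

def oversight_loop(run_iteration, total=10_000, checkpoint_depth=3):
    """Run iterations, filing a report and rolling back when output deviates."""
    history = []     # past (iteration, output) pairs: our saved 'states'
    baseline = None  # what a 'normal' iteration's output looks like
    reports = []     # what the 'send report' button produces

    for i in range(1, total + 1):
        output = run_iteration(i)
        if baseline is None:
            baseline = output  # first iteration establishes the norm
        elif output != baseline:
            # The 'stop' button: note the state we roll back to (e.g. iteration
            # 998 when iteration 1,001 misbehaves) and file a report.
            rollback_to = history[-checkpoint_depth][0]
            reports.append({
                "deviated_at": i,
                "rolled_back_to": rollback_to,
                "got": output,
                "expected": baseline,
            })
            continue  # skip recording the bad output
        history.append((i, output))

    return reports

# A toy 'McBot' that misbehaves exactly once, at iteration 1,001.
def flaky_bot(i):
    return "widget" if i != 1001 else "wodget"

reports = oversight_loop(flaky_bot)
```

With `checkpoint_depth=3`, a deviation at iteration 1,001 rolls back to iteration 998, matching the scenario in the text; the report then goes off to the debugging team.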
Jobs in the future will simply be more boring and less intensive, and, ironically, much more friendly to people who are unmotivated and/or low-IQ across all domains. This will be life at the bottom rung for most people: just standing there, in some way, shape, or form, as an autonomous human 'bug report scanner', reporting bugs to people who used to program for a job but now debug for a job.
Similarly, creatives will go from mostly actively creating to having AI generate material and then editing it.
Similarly, politics will go from people actively drafting policy to having AI draft bills and then editing them.
And militaries will become the same thing: human oversight, AI labor, humans acting as weird autonomous debug nodes in a greater autonomous system.
If it's any kind of apocalypse, it's only one because everyone essentially becomes the bored security guard waiting for his shift to end at a quiet grocery store. Most people will just have incredibly boring jobs. Repeat this structure for everything from fast food to entertainment to politics to art to videogame development to fucking driving.