Not sure if someone has said this yet, but The Boys. On the surface it doesn't seem woke, but half the show is demonizing Christians, conservatives, and straight white men, with little reciprocal satire of left-wing people. It also pulls that "girls get it done" bullshit it was literally mocking Marvel for doing.