I know Americans are ignorant of other countries' histories, laws, and all that, but why are they so ignorant of their own laws?
Is nothing taught in schools? I don't mean college either; I literally mean high school. It's weird: in other countries we learned about important issues and laws specific to the relevant geopolitical areas. In high school. We went over the constitution, basic American civil law, its history, and the history of important civil rights events.
This is all beside the fact that they aren't secret police, nor is using the feds some verboten action. If you go nuclear, you'll be treated like any other terrorist, Ryan, you cock goblin.
BTW, Ryan's Reddit is a fucking blast.
He gets into "Toxic Masculinity" in one of the first posts: https://archive.vn/KgP7j