America hasn't exactly died, but it's been given a terminal diagnosis. Anyone whitepilled or thinking the left will "burn themselves out" is engaged in some stage of coping or irrational optimism. There have been times when the momentum slowed, like the radical 70s giving way to the Reagan 80s, or the politicized early 90s to the ambivalent, ska-soaked late 90s, but the country has been moving in one direction for a long time.
What's so interesting is that seemingly everyone can agree the country is in dire straits and getting worse, that even outside of pure politics our culture is wretched and decaying, yet they refuse to accept the logical conclusion that our dominant social narratives are largely to blame. The left can't behave as if they're in power, even though they rule over every institution outside of the Senate and portions of the nation's police, because it would mean accepting responsibility for the carnage they've wrought.
Eh, once the debt bubbles all start bursting at once - the education one, the housing one (again), the social services one, the pension one, the health care one, so on, so forth - I think you'll see troubling times for the nation, but at the same time an opportunity to take a scouring pad to this extravagant shit. Of course American culture is dying; it's been hegemonic and has bequeathed utter comfort and stability for a long, long time. An empty culture of consoomerism has flooded in instead, as people decide they don't "need" that other icky stuff anymore. Once the debt bubbles all burst, though, well - yeah, you might just need that other stuff, or you'll be ground up.