This assumes Nazi Germany is even taught as anything other than "a thing that was bad." There is no effort whatsoever to explain why Nazi Germany came to be in the first place. I am old enough that I had a history teacher who would get genuinely enthusiastic explaining the details of World War I: how Woodrow Wilson's stroke left the Versailles Treaty far harsher than the more moderate policy he had favored before it, how the unfairly treated German people resented it, and how that outpouring of rage ended with Hitler coming to power.

It doesn't help that the people who claim "TRUMP'S AMERICA IS LITERALLY NAZI GERMANY" only know Nazi Germany through public high school history classes or pop culture, and are also among the most privileged assholes in the country, people who have never experienced any form of societal oppression beyond their parents telling them no.
A lot of people act like Hitler was just some demon randomly sent from Hell, instead of the inevitable result of very bad decisions made by people, most of whom were not even Germans.