No, but I want the United States government to leave others alone and take care of its own country. I mean, America isn't even a good "I want to move there when I grow up" country anymore.
The people seem to be alienated from their politicians, the culture has devolved into consumerism, and things that aren't considered normal anywhere else get accepted as normal in the name of freedom.
Obviously I don't want Europe or Asia to suffer, but maybe this entire Ukraine War will wake the government and the people up to the idea that the United States should be more than just its military.
They should be #1 at everything, not just at killing foreign people for profit while brainwashing their own people into accepting their country's wrongdoings.