Honest question, WTF is going on over in America?
From my POV as an outsider looking in, your whole political system has gone mental. What happened to just governing and campaigning with common sense? How the hell did the truth become optional? Why is every issue a “culture war” issue? I can’t wrap my head around it.
As someone on the inside of the US…I don’t know. We’ve always had a pretty strong strain of anti-intellectual idiocy, but the past decade or two has just been bizarre. The ugly hate and willful ignorance are embarrassing, scary, and really demoralizing.