What the fuck has gone wrong with America?
When I was growing up, the United States seemed like the best country on the planet. It was the very model of a democracy, where freedoms were a given and where, given effort and willingness, a man could make something of his life and enjoy significant rewards.
Today, the country looks like a hell-hole. Poverty, crime, massive reductions in freedom and liberty, once-proud cities almost totally derelict...
The country is governed by a racist, bigoted, thin-skinned sexual predator. A man whose incompetence is matched only by his ignorance.
America, once the "policeman of the world," still exerts its influence globally. But it does this not for peace, democracy, or freedom. It does it only so it can bleed "managed" nations dry of their natural resources, with no remuneration of any sort for the indigenous populations (apart from massive debt).
I hate what America has become.