As an American, it bothers me that our government (and I put the blame on BOTH major political parties) is making America look bad by invading other countries, condoning torture, etc. I took a class in college on US foreign policy from 1900 onward, and my professor did a VERY good job of remaining neutral while presenting the facts. I was absolutely appalled by how the US invaded other countries, sent CIA agents to assassinate people only because they had socialist leanings, etc. In fact, it seems the only justified war the US has engaged in since WW2 is Afghanistan, and that's only because that country harbored an organization that actually attacked us. Anyway, enough of that tirade...
I am relieved every day that many people in the rest of the world seem to direct their anger only at the US government, not at the people of the US. It gives me hope that things can still get better. So, how can the US heal its relations with the rest of the world?
2007-01-29 12:42:44 · 7 answers · asked by Steady As She Goes in Politics & Government → Other - Politics & Government