When I was growing up, shortly after WWII, I thought that the USA was a great country, a leader by example to the other countries of the world. We stood for truth and justice, morality and compassion. Now, it seems, our government's agenda is only to increase the wealth of the wealthy and to bully our way around the world. People from other countries don't respect us, and daily we learn of more atrocities committed in our name.
Reading some of the answers in this venue, I am chilled to the bone by the stupidity and bloodthirstiness of the young Americans.
For me, everything started to change during the Vietnam War, when we realized that we had been lied to for so many years.
How about you?
2006-12-30 17:50:44 · 11 answers · asked by Joey's Back in Politics