As an American raised abroad, I have always wondered why Americans talk about being the only country in the world with "freedom," and why they always say other people come here to enjoy that freedom. I really don't understand where this myth comes from. I don't think the US is actually a free country; I think there is a lot of repression and a lot of laws that shouldn't exist. From what I see when I come to visit, there is a lot of brainwashing going on in schools, and kids grow up thinking that the US is free and democratic and that all its wars are for the benefit of the world. All of this seems ridiculous to me. I have been to a few third-world countries, and they were all much more free and democratic than the US, not to mention better places to live. Maybe I should feel glad that I was raised in one of these so-called "uncivilized" third-world countries. Anyway, it's kind of hard to understand my own people sometimes when I come to the US on vacation.
2007-02-06 23:19:49 · 21 answers · asked by bocona49er in Politics