Now I know I'm making a generalized statement myself, but why does the rest of the world feel the need to be prejudiced against Americans? Sure, our foreign policies are screwed up, we like to eat, and the unfortunate images of us that usually reach the rest of the world are of Evangelical Christians who believe our idiot president is doing God's work on earth. Sure, we may come across as biased and racist ourselves, but far, FAR from every person in the good ol' U S of A is like that. Does stereotyping and labeling us really make you any better than your unfortunate image of us?
Just putting that out there.
2007-01-02 14:24:34 · 13 answers · asked by superfly9492 in Other - Cultures & Groups