Don't say you've done your research unless you really, truly have. Also, hearing it on TV or learning it in school doesn't count, because those are government-controlled (or at least government-influenced) sources.
Isn't this claim hypocritical coming from someone who was born in the U.S. and has spent their entire life there?
Why do people so readily accept it without actually knowing whether it's true?
Since I know people will want to attack me, I'll put the disclaimer up front: I am not saying America is the best, or that it isn't, or that it's good or bad. I am only asking whether it is hypocritical to make that claim, and why people so readily accept it as truth.
2006-08-09 19:41:18 · 13 answers · asked by Brianman3 in Politics