Maybe it isn't much of a myth to many people, but here in this tiny little town of mine it has a long history: people who have settled, worked, or visited the US at some point in their lives always seem to sound, act, and live better than the rest of the flock here. So my naive question, for those who come from a so-called exotic background in a developing nation and are now working or staying in the US, is this: is the US really that much of a better place to live one's life? Is the gaudy, glossy image of the US one gets through the media actually true? Thanks in advance to everyone who replies.
2006-10-31 21:36:38 · 3 answers · asked by Anonymous