
If you look at the history of Hollywood, it goes from the cavalry saving the wagon train, to Rambo sorting out those sneaky commies all on his own, to Ronald Reagan coming up with his "Star Wars" plan.

Is that last one a leap too far? Or is it now part of your national psyche that "Truth, Justice and the American Way" will prevail?
Could this view actually affect foreign policy?

2006-12-13 10:48:16 · 3 answers · asked by Anonymous in Society & Culture Other - Society & Culture

3 answers

No, I don't think Hollywood has affected our country's view of its place in the world. I do think Hollywood believes it has.

2006-12-13 10:51:54 · answer #1 · answered by Monte T 6 · 0 1

Yes, of course, especially among younger people who don't really know who they are or how they fit in. They tend to mimic what they see and think is cool, with no idea that the lives many Hollywood people lead are completely unrealistic for a normal person.

2006-12-13 18:58:39 · answer #2 · answered by oohLa 3 · 0 0

Yes.
It tells us that we saved the world and that the world would not be the same without us. We are the international saviors who rescued everyone from tyranny, mainly Hitler's.
Hollywood just enhances American egoism.
That's why I don't like or trust Hollywood.

2006-12-13 18:50:53 · answer #3 · answered by . 7 · 2 1
