On several questions here I have seen posts about how Americans get their historical view that America won WWII, saved Europe and England, and defeated Germany. Some answers say these beliefs come from Hollywood movies that portray Americans in the savior role, even placing US forces in battles they were never part of.
My question is: which movies presented as historical (as opposed to a story like Saving Private Ryan, which merely happens to be set in WWII), and about which battles, did Hollywood make that would lead others to think Americans get their view of the war from Hollywood rather than from a history book?
I've read history books, I've talked to WWII veterans, and I've seen WWII movies. Movies are entertainment, not history. Some are based on historical events, but it's still a story.
I see the 'accusations,' but I never see a reference. Does anyone have one? I'm not saying there are or aren't any; I'm just curious.
2006-08-06 00:56:57 · 6 answers · asked by Michael in Politics & Government ➔ Military
This question is directed at those who believe Hollywood has somehow shaped the history books and tricked Americans into a distorted view of how the war was fought and won. I understand the morale-building purpose of wartime films, and I agree with it. What I am looking for are actual references from people who blame Hollywood for Americans' view of how the war was fought and won. I firmly believe that without American support to England, Russia, and China before 8 Dec 1941 (when the US declared war and officially entered WWII), Germany and Japan would have dominated even more of the world than they did. England especially, I believe, would have fallen had it not been for the US money, weapons, supplies, and pilots that went over to help protect the islands.
I'm looking for references to claims that America did less than the history books state, and that battles fought and deeds done were invented by Hollywood rather than being actual occurrences.
2006-08-06 02:37:42 · update #1