On several questions I have seen posts about how Americans get their historical view that America won WWII, saved Europe and England, and defeated Germany. Some answers say these beliefs come from Hollywood making movies that portray Americans in the savior role, even making movies that show the US in battles it was not in.
My question is: which movies did Hollywood make that are presented as historical (as opposed to a story like Saving Private Ryan, which just happens to be set in WWII), and about which battles, that would lead people to think Americans get their view from Hollywood rather than from a history book?
I've read history books, I've talked to WWII veterans, and I've seen WWII movies. Movies are entertainment, not history. Some are historical, but it's still a story.
I see the 'accusations,' but I never see a reference. Does anyone have one? I'm not saying there are or aren't any; I'm just curious.
Asked by Michael in Military · 2006-08-06 · 6 answers