On several questions I have seen posts about how Americans get their historical view that America won WWII, saved Europe and England, and defeated Germany. Some answers say these beliefs come from Hollywood movies portraying Americans in the savior role, even showing the US in battles it was not actually part of.
A. Is it because American filmmakers felt that way, or because that is how they thought the public wanted to see it?
B. Is there some truth to it, but exaggerated?
Why is it that they concentrate solely on the American effort?
Tora! Tora! Tora! is probably one of the better WWII movies; however, it was made in the 1970s and deals only with the attack on Pearl Harbor.
Asked by stepperry2008 in Military · 2007-09-06 21:24:54 · 12 answers