On several questions I have seen posts about how Americans get their historical view that America won WWII, saved Europe and England, and defeated Germany. Some answers say these beliefs come from Hollywood making movies that portray Americans in the savior role, even making movies that show the US in battles it was not in.

A. Is it because American filmmakers felt that way, or is it just how they felt the public wanted to see it?
B. Is there some truth in it, but exaggerated?
Why is it that they concentrate solely on the American effort?
Tora, Tora, Tora is probably one of the better WWII movies; however, it was made in the '70s and deals only with the attack on Pearl Harbor.

2007-09-06 21:24:54 · 12 answers · asked by stepperry2008 2 in Politics & Government Military

12 answers

Go rent "A Bridge Too Far." It's probably the most historically accurate movie about WWII ever made, and its primary focus is on the British involvement in the war.

2007-09-06 21:49:27 · answer #1 · answered by Marine till Death 4 · 6 1

Hollywood has a tendency to portray events with its own flair; however, the movie The Longest Day was very accurate, as was Tora, Tora, Tora. Other films like Saving Private Ryan took many liberties. The series Band of Brothers was also quite accurate. Of course Hollywood always put America in the savior role, but there was also realism in many of the Hollywood movies. Patton was mostly true but given the Hollywood touch.

No doubt about it, America won the war in the Pacific almost by itself; however, the war in Africa and Europe was another matter. The Brits, Poles, French and Russians did a wonderful job in defeating the Germans. The Russians, who took a bad beating from the Germans at first, turned things around, beat the Germans back to Berlin and then beat the heck out of them... but the Russian losses were extremely high. I don't believe Hollywood has made a movie about Russia's fight in the war. Americans like to watch Americans win so they can go rah, rah as they watch the movie.

2007-09-07 04:50:45 · answer #2 · answered by Anonymous · 1 1

I'm not American, and the only American WWII movie that I have ever seen is Saving Private Ryan.

But I think it's not surprising that American war movies concentrate on the role of America. Most of these movies are entertainment; their ambition is not to be neutral documentaries but to sell, so they show what many Americans want to see and can identify with. It's also not necessarily a falsification of history. Just because a film focuses on the American effort doesn't mean it denies the role of other countries.

And there is some truth in it. I mean the US had a savior role in WWII, even though it wasn't them alone.

By the way, the only three German WWII movies I know have Germans as their main characters: one is about Stalingrad ("Stalingrad") and the other two take place almost at the very end of the war ("Die Brücke"/"The Bridge" and "Der Untergang"/"The Downfall"). All these films portray Germans as victims in defeat, in a way that leads the viewer to identify with them and feel sorry for them. These films of course carry a very different message than American WWII films, since nothing could make heroes of the Germans in WWII, so the best they can be portrayed as is victims. But what they have in common with American films is that their main characters are people that audiences in the countries where the films were made can identify with.

And by the way, I know Russians who answered the question "Who won WWII?" simply with "Russia." So it's not specifically an American thing to believe your own country won alone.

The Cold War also played a role. During the Cold War, very few people in the West wanted to praise the role of the Soviet Union in the war, and thought it better not to mention that the Soviets had ever been an ally.

2007-09-10 14:04:39 · answer #3 · answered by Elly 5 · 0 1

American filmmakers want to make movies that people will go to see, so playing around with the facts is necessary to make a movie on a subject you fell asleep to in history class. Having said that, the English have never been honest with their history. In my time there, I never heard them say one negative thing about their own past. I remember how pissed off everyone was when the movie The Patriot came out; everyone was accusing Hollywood of rewriting history and continuing its tradition of casting the English as villains. Imagine my surprise when I found out that English children never studied the American Revolution. That raises the question: how do you know Hollywood rewrites history if you don't know the original history?

I'd like to add that Hollywood isn't the only place that makes movies; England does too. Several posts mentioned the other countries that contributed to the Allied victory. Now be honest: if you made a movie in England, would it be about the Russians' march to Berlin, or would you concentrate on the British fight?

2007-09-07 07:23:06 · answer #4 · answered by Anonymous · 0 1

Yeah, the movies spun it to make us look like the conquering heroes. Sure, it's what the American public wanted to see, and it's played a huge role in what the average American perceives as the historical truth. However, had we not supplied the manpower, technology and manufacturing capability that we did, we would live in a very different world than we do now. We did not do it alone, but the other nations would have had a nearly impossible time taking Europe and Asia back. Even if the Soviets had managed to defeat Germany on their own (doubtful), would it have been any better to have ceded Europe to the benevolent embrace of Soviet Communism? Hitler was a pussycat compared to Lenin, Stalin, Beria, etc. America was not solely responsible for victory, but without America, things would have been terribly bleak for Europe and Asia. Anyone who can't or won't acknowledge that is either ignorant or incapable of seeing the truth.

2007-09-07 04:43:08 · answer #5 · answered by Anonymous · 3 1

You need to remember that films made during World War II in America (as in other countries) were often used as propaganda tools and therefore cannot be taken at face value.

From what I recall seeing in American film, most attention is paid to the heroics, and very little to the politics of the situation. Take The Great Escape or Stalag 17, for example. They were made long after the war was over, but what do they really tell you about the political situation? There are crumbs of truth, but a lot of leeway is given to the artist to make it entertaining and engaging.

Note: Other nations also put their own "local" spin on world events in their films.

2007-09-07 04:48:23 · answer #6 · answered by Runa 7 · 1 1

I don't look at movies for WWII history; the movies are just entertainment. If I want real WWII facts, I will watch the History Channel or the Military Channel. They tell all sides of the battles, and of the war. I actually like that much better. I don't like hearing about the same big-win battles that America had, such as Omaha Beach. I like hearing about the ones that don't get told in the movies, like the Battle of Britain, Operation Market Garden, El Alamein, and Monte Cassino. Those are way more interesting than the same ones we hear over and over again, although all the battles are good to remember.

Hollywood concentrates solely on the American effort simply because... Hollywood is selling its products to... America. It just happens that the rest of the world wants to see the things Hollywood makes.

2007-09-07 04:37:27 · answer #7 · answered by applebeer 5 · 3 1

Everyone has mentioned films made after the war.
There were films made during the war to let the American public know what was going on. They were for the most part propaganda films.
One that sticks in my mind is the Frank Capra series "Why We Fight". It told the American people what was going on in Europe and the Pacific, in places they had never heard of.
Back in 1942, when the movie "Casablanca" was released, most Americans did not know where Africa was, much less French Morocco or Casablanca. In fact, the name of the movie was changed from "Everybody Comes to Rick's" to "Casablanca" after we went ashore at Casablanca.

2007-09-09 04:49:22 · answer #8 · answered by Tin Can Sailor 7 · 0 1

It's pretty rare to see a Hollywood film about other nations at war.
The US didn't get into the fight full on until 1942, so the war had been raging for a while. The studios make movies that people will see. That's why you won't likely see big-budget movies about the Canadians on Juno Beach on D-Day, the British in Hong Kong, etc.

2007-09-07 04:34:25 · answer #9 · answered by Anonymous · 2 1

I don't see why American movie makers should be concerned with, let's say, the Filipino war effort against the Japanese. Europe has its own well-developed movie industry; it can deliver its own heroic epics.

By the way, can you name some of the movies in which Americans fight battles they were not actually involved in?

The movie industry produces what is in demand; it has to make money, not educate. Educational channels do not broadcast Saving Private Ryan or We Were Soldiers. No one goes to the movies to get the slightest bit of education. That's it. If you don't like it, don't watch it.

Our local moviemaking industry tends to show how our past communist era was bad, but survivable and actually with a touch of fun. This is mainly because no one would go to the cinema to watch a historically accurate reconstruction of hard-line Stalinist oppression.

To your question: I don't think the American movie makers are twisting the history; they are just highlighting the crucial moments attractive to their audience, and those are, unsurprisingly, patriotic American soldiers engaging in battles and their patriotic supporters back home.

2007-09-07 05:06:26 · answer #10 · answered by Anonymous · 1 3
