
I doubt there are many WW2 era Americans surfing the web, but you never know. Besides, I might find a history buff.

When I took History throughout my school years, I always thought that one of the reasons America declared war on Germany was because of the Concentration Camps and Hitler's Genocide campaign.

When I watched the "Band of Brothers", it seems that the concentration camps were discovered only after Germany was invaded by the Allies toward the end of the war.

My question is: in 1941 and 1942, what was America's primary reason for declaring war on Germany? Was it because of the atrocities against Jewish people and others, or was it simply because Germany invaded Poland, France, etc.?

2007-01-12 15:32:02 · 11 answers · asked by Slider728 6 in Arts & Humanities History

11 answers

Nah, the death camps and such only became a strategic issue when they began to be factored into war plans for conquering Germany. There has been quite a bit of debate about whether the Allies should have bombed known camps like Auschwitz and others within range, especially since those were actually quite well known to Allied intelligence services. The argument was that, while a significant number of civilians inside would be killed, it would put the camps out of commission permanently and possibly prevent much of the rest of the Holocaust.

In reality, the concentration camps were a huge waste of resources for Nazi Germany, and the Allies realized this, reserving air assets for naval yards, airfields, ammo dumps, fuel depots, and other strategic targets. Actually, given the disproportionately high casualty rate of Wehrmacht officers (especially general officers), an officer was safer serving in a camp than in many of the rear-line units at the fronts.

But yeah, WW2 mostly came down to a worldwide disruption of the military-industrial order. The US had security over North and South America, Imperial Japan had control of East Asia and parts of China, the Soviet Union was doing its thing, and Nazi Germany had secured Europe, so all the world's major players were in place. When Japan attacked the US and dragged Germany into war with the US, Germany and Japan made business bad for everyone in their respective spheres of influence. The United States and Russia were suddenly handed chances they couldn't refuse to truly expand their influence and knock off two rivals, and they took them.

Wars and militaries are just extensions of a country's foreign policy and always have been. With war comes business, and someone standing in line to make it!

2007-01-12 18:38:19 · answer #1 · answered by Hotwad 980 3 · 0 0

America was bound to be sucked into the war eventually. The country didn't want to get involved in "Europe's war" so soon after WWI, but we had been aiding Britain and other Allied countries for years, which provoked the ire of Germany. Hitler also had designs on America and was in the very early stages of planning an invasion during the war.

When Japan attacked Pearl Harbor, that gave the US a reason to enter the war fully, and it declared war on Japan soon after Britain did. Germany then declared war on the US after we declared war on Japan, and we entered the fight against Germany as well.

Knowledge of the camps did not reach most GIs, though politicians and generals had heard reports and even seen spy photos of the camps. Most did not believe it until they saw proof. We did not enter the war because of the camps; that's simply false. In fact, many Jews were denied asylum despite knowledge of what they were going through in Europe.

Even if Pearl Harbor had not happened, we would have entered the war eventually.

2007-01-12 18:24:12 · answer #2 · answered by Bleaarg 3 · 0 0

No, it was the bombing of Pearl Harbor on Dec 7, 1941.
And Roosevelt declared war; we were quietly involved but hadn't declared until we were attacked. The Jews were not an issue, nor were the Poles; really, no one cared. This was a German/American country then, or should I say white on white bread. The Huns ran things here, along with all their little associates content to take the crumbs from the table of the kings and queens, like the Sicilians and the Irish; the British ruled here too, and the French, as well as Norwegians, Dutch, Swedes, etc. I am not prejudiced, just telling you fact. Look at the history: www.history.com. Look at what they did to the Natives.
Attila the Hun would have been more merciful.

2007-01-12 17:01:11 · answer #3 · answered by Anonymous · 0 0

America didn't declare war on Germany first; it was the other way around. Four days after the Japanese attack on Pearl Harbour in December 1941, Hitler declared war on the USA, and the US reciprocated the same day.
America had been trying to stay out of the war in Europe while trading with both sides.

2007-01-12 17:57:51 · answer #4 · answered by brainstorm 7 · 1 0

Germany was allied with Japan. After Pearl Harbor, the U.S. declared war on Japan, thus leading to war with Germany.

2007-01-12 16:23:14 · answer #5 · answered by Anonymous · 1 0

Technically it was because Germany declared war on America right after Pearl Harbor. However, the US had already been helping the Allies with equipment and supplies. Without getting too complex, basically we were helping our allies.

2007-01-12 15:40:42 · answer #6 · answered by QandA 2 · 5 0

The United States entered WW2 by declaring war on Japan on December 8, 1941, the day after the attack on Pearl Harbor on December 7, 1941. Germany declared war on the United States on December 11, 1941, because Japan was its Axis ally. Roosevelt then had no choice but to declare war on Germany.

"Now it is impossible for us to lose the war!" he said, wildly exaggerating the strength of Imperial Japan, which, along with Mussolini’s Italy, was his partner in the Axis alliance. "We now have an ally," claimed Hitler, "who has never been vanquished in 3,000 years!"

The German dictator personally congratulated the Japanese ambassador to Berlin for his country’s success in catching US forces off guard: "You gave the right declaration of war! This method is the only proper one!"

2007-01-12 15:52:05 · answer #7 · answered by Akkita 6 · 2 2

Because we first declared war on Japan for bombing Pearl Harbor and Japan was allied with Germany.

Yep, I'm too young for WWII, but my father was with the Counter Intelligence Corps in Europe.

Trivia bit: Churchill was related, distantly, to President Roosevelt (8th cousins once removed, descent from Mayflower pilgrim John Cooke).

2007-01-12 15:36:29 · answer #8 · answered by WindWalker10 5 · 4 0

The primary reason the US went to war with Germany was German submarines sinking US ships carrying supplies to England, plus hard British pressure for the US to enter the war. Stories of atrocities were told by people leaving Europe, but few Americans believed them, including Roosevelt. Not until the camps were found did the truth come out.

2007-01-12 15:47:39 · answer #9 · answered by oldguy 6 · 2 3

America got involved when Pearl Harbor was bombed by Japan. No, it wasn't for humane reasons at all.

2007-01-12 15:58:43 · answer #10 · answered by Snowflake 7 · 2 0
