
By the 1920s, Americans believed that World War I had been fought by the US mainly for what cause in particular?

To make political alliances with emerging African nations

To create profits for American business and manufacturing companies (especially weapons and related war supplies)

To get Teddy Roosevelt re-elected

To avoid losing England as an ally

2007-07-14 11:34:21 · 4 answers · asked by Anonymous in Arts & Humanities History

4 answers

There are some liberals who would say that it was the second one: to create profits for American business and manufacturing companies.

2007-07-14 11:45:02 · answer #1 · answered by kepjr100 7 · 1 0

There are some conservatives who would say that America was a charitable, altruistic nation that would never have stooped to profiting from war.

But people were more realistic in the 1920s... and the facts (and company balance sheets) speak for themselves.

I'm not being judgmental... I mean, America didn't MAKE the Europeans whizz their fortunes and empires away in a futile war... but it's so laughable that people on a certain side of politics find the bare facts so difficult to accept.

2007-07-15 16:45:35 · answer #2 · answered by llordlloyd 6 · 0 0

None of the above - - - if issues were that simple there would be no wars - - - - but if some idiot assigned you this question then they are expecting "to create profits etc" - be aware of False Prophets/Profits.

Pax----------------------

2007-07-14 18:39:18 · answer #3 · answered by JVHawai'i 7 · 0 0

The US did not take part in World War I because there was a civil war in the US, so it could be the second one...

2007-07-14 18:58:43 · answer #4 · answered by Sesi 1 · 0 1
