That is a pretty loaded question.
It's probably a question better asked of Google or Wikipedia; it actually sounds like an essay question.
2007-10-06 14:09:01 · answer #1 · answered by Easy Mohinder 3 · 0⤊ 1⤋
At its birth the USA was quite naive and longed to keep out of all European wars and world affairs. This stance was later reinforced by President Monroe's doctrine of "America for the Americans."
Nevertheless, real life eventually opened America's eyes: the US had to intervene in World War I (1914-1918).
And again in World War II, when the world seemed about to fall under the control of three cruel dictators, President Roosevelt imposed sanctions on Japan for invading China, where the US had commercial and economic interests, and signed the Lend-Lease Act, above all to help Britain withstand the brutal German attack.
Although still declaring an arguable neutrality, the US was viciously attacked by Japan at Pearl Harbor, and on the eleventh of the same month Germany and Italy declared war on the US.
This shocking experience dramatically changed the US view of foreign policy: the country learned it does not exist in a glass bubble.
Being the number one world power, the US has great interests to protect all over the world in order to sustain "the American way of life," and the only way to do that is to participate actively in world affairs.
2007-10-06 22:20:52 · answer #2 · answered by Anonymous · 0⤊ 0⤋
Given that the consequences of World War I were so horrific, and that so many American men had died for what many believed was nothing, Americans were overwhelmingly opposed to ever involving the nation in another foreign war. World War I had been billed as the war to end all wars. The United States returned to a policy of isolationism, believing that if it stayed out of the politics of other countries it would be safe, and foreign policy was limited more to business. An example that supports this: during the 1940 presidential race, President Franklin Roosevelt affirmed to the American people his personal opposition to U.S. involvement in foreign wars. "I've said this before, but I shall say it again and again and again: your boys are not going to be sent into any foreign wars."
The official policy was to ignore the makeup of other governments: a policy of tolerance as long as business was good. A kind of colonization policy.
There was also a major effort to punish and exploit the former enemy, which some believe helped bring Hitler to power.
2007-10-06 21:10:09 · answer #3 · answered by Anonymous · 0⤊ 1⤋
It changed everything. That was the war in which America became a superpower, by showing its might and by dictating policy in the world. Before it entered the war, the American military didn't get much respect; people used to call the troops "doughboys" because they thought they were soft and fat. The US went in and changed the world, and its image in the world has been that of a leader ever since: when you show you can dominate, people look at you differently, and you lead instead of follow.
2007-10-06 21:10:10 · answer #4 · answered by Domino 4 · 0⤊ 0⤋
Before WW1 the US practiced more "isolationism" (it largely kept to itself and worried about its own issues), while after WW1 the US was more interested and involved in world affairs, including being instrumental in starting the League of Nations, the forerunner of the United Nations.
2007-10-06 21:09:41 · answer #5 · answered by megalomaniac 7 · 0⤊ 0⤋
The USA became a part of world politics, instead of remaining isolated and apart from it.
Also, England became an ally instead of an opponent.
2007-10-06 21:12:40 · answer #6 · answered by million$gon 7 · 0⤊ 0⤋
Don't worry.
Life is time, and it is spent by time.
About a world war? Don't worry, it has not come yet.
Our lifetime is about 100 years; how much of it will you spend asleep?
Up to age 12 you don't know anything; only after 12 do these problems arise.
All the best.
2007-10-06 21:19:45 · answer #7 · answered by Anonymous · 0⤊ 0⤋