
I'm looking for some of the most important promises America made as it emerged victorious from World War II. I also want to know whether America was able to keep those promises in the years after the war was over.

2007-08-31 15:06:25 · 9 answers · asked by Anonymous in Arts & Humanities History

9 answers

I don't remember if the US made a formal promise to help rebuild Europe, but through the Marshall Plan, the US did just that. The Marshall Plan helped Western Europe rebuild its economy and infrastructure. Here is a website that might help you understand it better.

http://usinfo.state.gov/usa/infousa/facts/democrac/57.htm

2007-08-31 15:15:42 · answer #1 · answered by kepjr100 7 · 0 2

One promise that the US somewhat failed to keep was Article 3 of the Atlantic Charter: all peoples have a right to self-determination.

Right after WW2, the Vietnamese (living in what was then known as French Indochina) were looking for independence from France. The French were determined to regain their colony, and the US was determined to support its 'ally' France. We all know what a disaster that ended up being.

The Dutch were also determined to reassert control over their colony in the Dutch East Indies. The US again failed to act decisively in favor of independence, and it took a few more years of insurrectionist war before the Dutch withdrew.

2007-08-31 18:09:42 · answer #2 · answered by Ice 6 · 0 2

Well, after WW2, the UN was founded to keep peace among nations, but you can see how that one turned out. America is about as good at peacekeeping as the local playground bully. I love my country and all, but sometimes these politicians and diplomats are just so frickin' stupid. America always has to be the big brother, but I think we should keep our big noses out of other people's business and worry more about the problems we've got here.

2007-09-07 16:28:20 · answer #3 · answered by ? 1 · 0 2

Yes, we were the Allies and we were victorious. After WWI, Germany was left demoralized and defeated, and after WWII the Allies realized that leaving Germany like that had helped start WWII. So after WWII ended, the Allies got both Germany and Japan back on their feet economically and industrially.

Like manufacturing cars.

It used to be that American cars really dominated, but not anymore.

2007-09-07 15:45:50 · answer #4 · answered by Will 4 · 0 2

Well, I am thinking of the rebuilding of war-torn Europe.
Also helping Japan regain its industry, to the point that they are showing us how to make cars with what we taught them.
We kept our promise to Israel to help pave the way to it becoming a nation again in 1948.
I'm sure there are many more.

2007-08-31 15:16:31 · answer #5 · answered by Terry 2 · 0 2

We promised that one day you could go on the Internet and say whatever you want and not get your head chopped off. That's not good enough for some people, though.

2007-09-05 16:38:47 · answer #6 · answered by LIMBAUGH 08' 2 · 0 2

I didn't know "America" made any promises. What exactly are you talking about?

2007-08-31 15:10:26 · answer #7 · answered by Anonymous · 0 2

Treaties are meant to be broken; just ask the Native Americans. We promise all kinds of crap, get them to comply, then invade, kill, and steal land. It doesn't matter what we promise, we will take it in the end.

2007-08-31 16:11:58 · answer #8 · answered by franktowers 2 · 1 4

What are you talking about?

2007-09-08 11:55:22 · answer #9 · answered by Anonymous · 1 0
