
Please answer! :-)

2007-03-03 04:13:10 · 20 answers · asked by num1soadfan 2 in Society & Culture Other - Society & Culture

20 answers

For the same reason they think they invented everything - arrogance.
Do you know that Alexander Graham Bell was voted one of the Greatest Ever Americans? HE WAS SCOTTISH!
The Americans have only ever won one war on their own - their Civil War!

2007-03-03 15:46:51 · answer #1 · answered by Anonymous · 0 0

I always thought the Allies did. However, the U.S. got in at an opportune time to help England, Russia, France, and China. I don't think the Allies could have won if any of the four big powers had not participated. It was a group effort, and this is coming from an American.

2007-03-03 04:20:48 · answer #2 · answered by Purdey EP 7 · 2 0

I don't usually like to answer a question with a question, but based on the events, why would one feel that America was not on the winning side? Granted, nobody really wins in war; everyone loses something. However, based on the events, America did force a surrender from Japan in WWII, and our aid to the UK and USSR was pivotal in their being able to stay in the fight. As for WWI, the situation is muddier, but again, if not for the aid of the US, I do not think it would have ended the way it did.

2007-03-03 04:22:37 · answer #3 · answered by warderbrad 2 · 1 1

Because both times the enemy surrendered. And in both cases the United States' contribution was instrumental to victory. We didn't do it *alone*, but without the US, victory would have been much more difficult if not impossible. (Actually, the Pacific campaign against Japan in WWII was almost entirely an American effort.)

2007-03-03 04:25:32 · answer #4 · answered by dukefenton 7 · 0 0

Because history is written by the winners. And we love to think great things about our short historical past.

Short compared to other countries like Italy and England. Also, we need something to hang our hats on. I am sure that no one is going to look back proudly at Vietnam, Somalia, Iraq, or Afghanistan.

2007-03-03 04:26:07 · answer #5 · answered by DeJay 3 · 1 0

Because in this society we need to believe the losses were worth the risk. Then again, any war where we didn't belong in the first place should be an even greater wake-up call. There are NO WINNERS!

2007-03-03 04:21:35 · answer #6 · answered by Kyle L 1 · 0 2

We don't, we just know we were on the winning sides of both wars.

2007-03-03 04:16:35 · answer #7 · answered by Robert 1 · 2 0

Because Americans can't understand the idea that it was the world vs. the Nazis. They think it was the Americans vs. a world full of Nazis.

Americans... gotta love them. lol

2007-03-03 04:55:17 · answer #8 · answered by Michelino 4 · 1 1

Because they were on the winning side. Also, American aid to the UK and USSR during WWII was vital in helping them win.

2007-03-03 04:15:55 · answer #9 · answered by Anonymous · 2 0

They do think they won both wars. But when was the last war America won all on its own?

2007-03-03 04:20:31 · answer #10 · answered by peter o 5 · 1 1
