Did you know that since WW2 America has never, NEVER, won a war?

2006-10-11 04:31:14 · 9 answers · asked by Anonymous in Politics & Government Military

That is for peanut.

2006-10-11 04:31:41 · update #1

P.S. To all of you saying Desert Storm: if the US won that war, why was it necessary to attack Iraq again? Please support your answers.

2006-10-11 04:46:56 · update #2

9 answers

The US government has not won a war. The soldiers and Marines haven't lost one. The Korean War was stopped by negotiations; the North Koreans and Chinese had almost five times as many casualties as the US/UN. In Vietnam the US military won every major offensive. Again, politicians. Panama? Ouch, it was terrible to be up against the SEALs, Rangers, 82nd, and 7th Infantry. The Gulf War? The military slaughtered the enemy, and once again politicians stopped it. Grenada? Again, ouch! Bosnia. Kosovo. America has the most powerful military in the world, second to none.

2006-10-11 05:38:34 · answer #1 · answered by Anonymous · 0 1

There hasn't been a MAJOR world war since WW2, so no, obviously we haven't won a major large-scale war since then, because we haven't been involved in one. Korea could be considered a war, but it wasn't just the U.S. involved, and it was more of a policing action against communism. Same with Vietnam: war was NEVER declared by the U.S., so it again was a policing action. The Gulf War could again be seen as a policing action, since a country (Kuwait) was invaded by the dictator of Iraq and we went in to help it. As for the new wars in Afghanistan and Iraq, we declared war on the Taliban, not Afghanistan, and we LIBERATED Iraq because of a bad dictator who supposedly was a threat to the U.S. There have been no major WARS since WW2, and if you want to count the Cold War, we technically won that, but again it wasn't really a war. So America hasn't been able to win any real wars. I think you are defining war as a battle or military engagement; a war is two countries openly declaring war on each other, which has not really happened since the good old days of WW2.

2006-10-13 19:47:52 · answer #2 · answered by WW2 2 · 0 0

Did you know that since WW2 the media has had its nose in the middle of every war, showing US citizens the death and destruction and doing whatever it can to undermine any war effort? If there had been media coverage during WW2 like we see today, we might not have won that one, because the anti-war yahoos would have succeeded then like they do today.

2006-10-11 11:36:11 · answer #3 · answered by jasonzbtzl 4 · 2 0

We won the first Gulf War, which is to say we had realistic and defined goals in that war, and they were accomplished.

We won the cold war. Supposedly.

Do you realize we have now been in Iraq longer than we were in WWII? Now that's a sad fact.

2006-10-11 11:35:37 · answer #4 · answered by Fire_God_69 5 · 2 0

Stop smoking crack, bud. What constitutes victory in your eyes? Is it a huge trophy? Vietnam wasn't a loss. If you Dems had stood behind our troops instead of calling them "baby killers," we would have had a clear-cut victory. We have fought many battles since WW2 and destroyed many of our enemies. If you hate this country so much, you should move to France.

2006-10-11 11:39:59 · answer #5 · answered by only p 6 · 0 0

Actually, since WWII, the US has never declared war.

2006-10-11 11:35:05 · answer #6 · answered by October 7 · 0 0

What are you talking about? We beat the junk out of Grenada.

2006-10-11 11:33:15 · answer #7 · answered by region50 6 · 1 0

Grenada, Desert Storm.

2006-10-11 11:43:16 · answer #8 · answered by Anonymous · 0 0

THAT"S A DAMN LIE, WE KICKED KOREA'S *** AND I'LL KICK YOUR'S YOU HIPPY.

Also, I believe George Lucas single-handedly defeated the Galactic Empire, and HE'S an American.

2006-10-11 11:39:22 · answer #9 · answered by Matt the Allknowing 1 · 0 0
