
A) Revolutionary War B) Civil War C) WWI D) WWII E) Korean War F) Vietnam War G) the War on Terror

2007-06-29 04:24:32 · 20 answers · asked by Anonymous in Arts & Humanities History

20 answers

Importance is a fairly broad term, given that it can be applied to different arenas that are difficult to compare. For example, the Revolutionary War was important in that it was the official beginning of the U.S., but any major event in the past is a factor in future wars, and the Revolutionary War happens to be first on this list. (By extension, the French & Indian War of the 1750s, though it took place in colonial times, could be invoked by the same logic.)

The Civil War was important in that it established the U.S. as a unified nation. A couple of points worth noting: before the Civil War, the name "United States" was plural, but afterward it became a singular collective noun. Also, when Abraham Lincoln gave his famous Gettysburg Address, he intentionally used the word "nation" to refer to the whole, which was a bold move for the time.

World War I was not so much important to U.S. history as it was to Europe's. Still, it produced a new sense of isolationism and a national paranoia over communism. I tend to consider that minor.

World War II was definitely the most important war of 20th-century U.S. history. It turned the U.S. into the world power (along with the Soviet Union) after Europe decided to downgrade its global status. The wars after that (Korea, Vietnam, the Persian Gulf, Iraq) all result from the U.S.'s status since WWII, and were at most hiccups in the framing of the American experience.

So I'll say the Revolutionary War, Civil War, and WWII form a three-way tie. However, I proffer one not on the list. The War of 1812, generally not thought of, is a very important war in U.S. history. The results of that war led to the U.S. becoming the only real power in the Western Hemisphere, as it allowed the country to avoid being caught up in European affairs. Another result was that it allowed the U.S. to expand westward to the Pacific, given that the British withdrew from arming Native Americans in what is now the Midwest in exchange for Canada's safety from U.S. invasion. The War with Mexico is a direct result of this, and the Civil War is a child of it as well. Perhaps the most telling aspect of the War of 1812 is that before the war, an American was a First Nations individual, while whites were generally considered something else. After that war, the term American applied to the whole of the U.S., and thus our true national identity was born.

2007-06-29 06:25:58 · answer #1 · answered by Ѕємι~Мαđ ŠçїєŋŧιѕТ 6 · 2 2

Heck, that's tough. If we hadn't won the Revolutionary War then we wouldn't even have the country we have today, so that's obviously an important one. The Civil War was a MAJOR turning point in America, and again, if the Union had not succeeded then I have no doubt that America would have fallen into the hands of a Hitler or Stalin. And of course, if we hadn't fought back in either of the World Wars, who knows what would have happened. I'm gonna rule out the Korean and Vietnam wars as being all that important. The war on terror is just as important as any of the World Wars because we cannot back down in fear. If I had to pick just one war, I would go with the Revolutionary War.

2007-06-29 05:02:13 · answer #2 · answered by freedomfighter 3 · 1 2

A. B. C. D. E. F. All are important in U.S. history. Without the Revolutionary War there would not be a U.S. to have any history. The Civil War was important because it unified the nation (and it was not about slavery; that just came into play to give the war a moral issue to defend. In every war you need a moral issue to defend; you've got to have something to explain to the families of the soldiers who got killed). WWI, because we gained important experience in what to do and what not to do in WWII; it was also important because we later suffered through the Great Depression because of it. WWII, because we gained true world-power status as a country that should not be messed with. Korea and Vietnam both reflected worldwide fear of the spread of communism. Vietnam was especially important because America lost face in the eyes of the world and Americans no longer believed in their own country. The War on Terror, because it has somewhat pulled the country together. You left out some small but very important wars, such as the Spanish-American War and the War of 1812. The Spanish-American War was important because it was the only war where Americans were truly unified on one subject. The War of 1812 was important because it showed the world that we were strong enough and dedicated enough to rule our own country.

2007-06-29 09:06:39 · answer #3 · answered by Cookie Girl 3 · 0 1

Definitely A, the Revolutionary War. If we had lost (and we would have, had the British kept at us instead of just washing their hands of us), we would most likely still be part of the British Empire and the Commonwealth, much as Canada and Australia are.

Everything would have changed. No War of 1812, probably no Mexican War. The U.S.'s western border would probably be the Rockies (the British would have gotten the Louisiana Purchase after defeating Napoleon). There would have been no Civil War, or a smaller one. The British Army & Navy, once they had completely outlawed slavery in the South, would have crushed all resistance without the needless loss of life and one-dimensional thinking of Grant. We would have entered WWI and WWII earlier, and both wars would have ended earlier.

So, the Revolutionary War was by far the most important 'win' for the United States.

2007-06-29 04:34:51 · answer #4 · answered by IamCount 4 · 1 2

While the Revolutionary War and the Civil War are both important in the shaping of America, I think WWII is the most important. It set the U.S. apart as the leader of the free world. The politics evolving from that war have influenced world events ever since. The USSR has since fallen, but Israel's formation was a direct result of WWII and has shaped Middle Eastern politics since.

2007-06-29 10:09:03 · answer #5 · answered by jaytee556 3 · 1 0

A: First, because we would not be a country without it.
B: The Civil War kept the Union together and ended unjust slavery.
C: WWI was just us getting dragged into Europe's nightmare.
D: WWII was by far the most important in the 20th century, as it protected our country from a foreign power that had ambitions to rule western and eastern Europe.
E: Korea was a Cold War action. It showed that we would stand up to the U.S.S.R. and China.
F: Vietnam, the same, but we went about it the wrong way. It was an outgrowth of WWII; Truman wouldn't talk to Ho Chi Minh in San Francisco when setting up the U.N.
G: Undecided. It depends on these fanatics who keep attacking innocent people; I am waiting to see how the Western democracies are able to react.

What about these other wars:
War of 1812
Mexican War
Indian Wars
Spanish-American War

Without the Mexican War we would not have TX, CA, NM, UT, and AZ as states; we would be a smaller country.

The War of 1812 established that the U.S. has the right to trade on the open seas, although the British attacked the U.S. because we traded with France.

Indian Wars: we just seized the property and lands of Native Americans for the white population to move into and exploit, but without those wars we wouldn't have as many cities or states. I wonder how different it would be if the Indian nations still controlled vast amounts of land?

Spanish-American War: that's how we got involved in WWII, because we were a colonial power by having the Philippines in the Far East, and Japan wanted a sphere of Japanese control for raw minerals and oil, and we were in the way, as were China, the U.K., France, and the Dutch. In Japan I think it's referred to as the ABCD war, and maybe E and F, but I'm not sure:
A = Americans, B = Brits, C = China of course, D = Dutch,
E = Europeans in general, F = French

So looking back, it may be the Spanish-American War that was the most significant, because it started a chain reaction that led to most of the wars the U.S. has been in for the last 100 or so years, up to the present day. Even the present war on terror can be traced back to the Spanish-American War, because after WWII the U.K. divested itself of all its empire, and that's how we got these dictators and fanatics in control of the Middle East. Before WWII the Middle East was relatively peaceful for thousands of years; Jews, Muslims, and Christians all got along, except for occasional empire building by the Romans, Greeks, and Ottomans that disrupted peaceful living.


2007-06-29 08:56:28 · answer #6 · answered by Anonymous · 0 1

I would pick B and D

The Civil War turned the US into a united country under solid leadership and became the impetus for America to become an industrialized country.

WW2 destroyed the European empires, and the US became a superpower by not being devastated during the war. The US was the only advanced country whose homeland wasn't ravaged during WW2.

2007-06-30 00:17:28 · answer #7 · answered by Anonymous · 0 0

Great question! There's no way we could be the country we are without the Revolution or the War Between the States. But militarily, WWI marked a large shift from a small standing army, with militia called up for action as needed, to a substantial national military force with the manpower of a unit drawn from the entire country. The difference between serving in the US 69th Division versus serving in the 23rd Pennsylvania is subtle but profound.

2007-06-29 07:46:29 · answer #8 · answered by Anonymous · 1 1

The Civil War. It made us what we are today. As Shelby Foote said, "In 1860, if you asked the man on the street to describe the United States, he would most likely answer by saying 'the United States are...'; in 1865 he would have said 'the United States is...'" That change from the plural to the singular tells it all: the United States stopped being as regional in its thinking and began to become the nation-state that it is today.

Our national identity comes from the Civil War. We believe that our laws must be for everyone, no matter your color or creed. Even though we still have a way to go toward equality, we can all trace that belief in equality to that war.

2007-06-29 05:13:05 · answer #9 · answered by redgriffin728 6 · 0 3

I would definitely say World War Two. Not only because it was the biggest war, but it was the war that ended the era of European control over the world and ushered in the Cold War era, which made the US a superpower.

2007-06-29 06:16:34 · answer #10 · answered by greencoke 5 · 1 0
