
What do you believe is the most important event in US history?
Why?
This is a homework question, yes, but I'm trying to come up with an important event that I, a simple answering male, can answer without having to sprinkle in a bunch of bullcrap to make it long (like you're apparently supposed to do in English essays), so I'm getting opinions.
Thanks!

2007-09-06 14:59:21 · 15 answers · asked by Anonymous in Arts & Humanities History

15 answers

You have asked a very tough question, my friend. There are plenty of things happening in the U.S. right now, especially with the anniversary of 9/11 upon us again.
Would it be right for you to choose the Tuesday morning of September 11, 2001 as the biggest tragedy so far in our history? In an instant a superpower like the U.S. was on its knees, and people became united against a common foe.
And true heroes came out of that tragedy.
Or would you choose how this nation became one nation under God? A handful of colonies banded together and rebelled against the crown of England under King George, and thus formed their own Constitution and our own Bill of Rights, which we still practice to this day. Not only did our forefathers make a stand against a dictator, the king, but heroes emerged like Washington, Franklin, Jefferson, Patrick Henry, and many more.
A nation that started from 13 states is now 50. So it's up to you to choose.

2007-09-06 15:24:58 · answer #1 · answered by darkvadershield35 2 · 0 2

Definitely the American Revolution, accompanied by the Declaration of Independence. The colonists were being taxed by the British (because the British had been protecting them from the Native Americans and had just finished a war and needed money)...without representation. In other words, the people in Parliament made the laws, and the Americans had no say whatsoever. That, and other major events, changed the Americans' view of the British (they used to be proud to be British). Eventually, the American Revolution began, England and the colonies fought, and we won. =] It is definitely the most important event in U.S. history because a) it proved not only to Britain but to the rest of the world that it's necessary to break away from a tyranny (and that it's possible), and b) we probably would not be here today if we had never chosen to split apart. Of course, it was necessary for us to eventually split from the British...everyone knew it was coming. Hope that helps =]

2007-09-07 01:19:21 · answer #2 · answered by love&&life 3 · 0 0

I'm going to have to borrow from several authors and venture to say it was the Civil War. The war took us from a bunch of individual, self-absorbed states and turned us into a united country. It pitted brothers against each other and resulted in right around 626,000 casualties.

Further, it ended slavery as an accepted institution. That is not to say that everything was rosy, because it wasn't. However, the Civil War was the turning point in this nation's history. It also brought about some amazing technological advances, as wars always seem to do, and advanced medicine and the individual rights of all Americans.

2007-09-06 22:52:46 · answer #3 · answered by Super Lawyer 2 · 0 0

Well, it would have to be one of two things.
The first would be when the Pilgrims came and settled, or the founding of Jamestown, because that was the beginning of US history: coming to America and settling.
The most defining, though, would have to be the American Revolution; without that we wouldn't have the US, we would just be part of the UK.

Probably your best bet would be the American Revolution: there are a lot of known facts about it, and it really is the beginning of US history.

2007-09-06 22:05:55 · answer #4 · answered by Jackie Oh! 7 · 1 0

The Revolution and the First Continental Congress are two important times in US history. The Civil War was another important era. Then, on 9/11/01, we were attacked on our own soil. That was the first attack on American soil since World War II, when a Japanese submarine shelled the California coast near Santa Barbara.

2007-09-06 22:15:53 · answer #5 · answered by mesquiteskeetr 6 · 0 0

The Industrial Revolution, because it transformed the US into a modern country and led to many important things, such as an increase in immigration, the growth of cities, the rise of big business, and the reshaping of the US economy. It also eventually led to the creation of the FDA and other regulators.

2007-09-06 23:07:10 · answer #6 · answered by LG 2 · 0 0

There are a lot of events that define America, some obviously very important, such as the Revolutionary War, and some not so apparent, such as how the Vietnam War changed the country. Until then the country was rather innocent: you did not bad-mouth the government (i.e., protests and riots), you volunteered for military service rather than spitting on the soldiers returning from war, you did not hear much about "campus unrest," and so on.

2007-09-06 22:11:46 · answer #7 · answered by Anonymous · 1 0

The Virginia Company being willing to put out the money to fund a failing settlement.

Or the discovery of tobacco, the cash crop and backbone of the United States economy for the next 100 years.

2007-09-06 22:07:31 · answer #8 · answered by devinthedragon 5 · 1 0

The most important event in US history is the dropping of the A-bombs on Japan. The entire human race, and indeed the entire planet, is threatened by that act. That point alone is significant enough. We opened the door for the rest of the world to arm themselves in defense, and now the world is at the brink of destruction.

2007-09-06 22:06:28 · answer #9 · answered by Relentless 1 · 1 2

The entry into World War 2. It changed the country's foreign policy and how it views the world.

2007-09-06 22:28:14 · answer #10 · answered by Street Smart 4 · 0 0
