I guess it's important to learn American History so you can understand our country and its government a little bit more. When I took the class two years ago, I was kind of enlightened.
answer #1 · answered by glow 6 · 2006-08-15 14:27:01 · 2⤊ 0⤋
So you'll have something other than sports, cars, and women (for men), or shoes, purses, and guys (for women) to talk about at cocktail parties.
If you study REAL history instead of the silly crap that generally gets passed off as history, it's actually entertaining and edifying. How many American history classes do you suppose teach that Jefferson thought a revolution was necessary about every 20 years?
answer #2 · answered by lenny 7 · 2006-08-15 21:48:55 · 0⤊ 0⤋
So we know why things are the way they are, and so we don't repeat the dumb things that have already happened.
Schools today don't teach enough "American" history, and students don't really pay attention anyway. I think it is important to know this stuff so that things can be improved and evolve in the future. You can't improve something if you don't know how it began, right? The kids now are our future, and they will need to be the ones who grow the government so that it changes with the needs of the times to come.
STAY AWAKE IN HISTORY CLASS!! YOU DO NEED THE INFORMATION!!
answer #3 · answered by volleyballchick (cowards block) 7 · 2006-08-15 21:34:48 · 1⤊ 0⤋
It's important because if you learn American history, you can connect it to the things that are happening right now in the world. When you watch the news, you'll have a whole new perspective on what's going on. I also believe it is very important to know the history of the country you live in. You learn about it, and it helps you see how much hard work it took to make this country; you feel a sense of pride and honor to be living in such a country as the U.S. It helps with your nationalism!
answer #4 · answered by Pirates Life for Me! 2 · 2006-08-15 21:27:21 · 3⤊ 0⤋
I just took a FASCINATING history class, U.S. 1877 to present. It had so many details, from government corruption to what different minorities were doing during each decade. It explained how big business started to dominate the country, and how we've never been the same since WWII. It was great, and I am so glad to have learned it. History always repeats itself, so you need to know what DID happen to help with what WILL happen. It's about making informed decisions.
answer #5 · answered by advicemom 4 · 2006-08-15 21:24:23 · 1⤊ 1⤋
Other than teaching you about your country, it broadens your knowledge base, allowing you to understand the country's development and workings.
I take it you're looking for an argument not to have to take it in school. Believe it or not, Hollywood makes movies with a lot of historical references (because they know what's taught in school), so you can relate and enjoy them more. I guess you could just get more popcorn during those parts!
answer #6 · answered by Mike K 3 · 2006-08-15 21:46:08 · 0⤊ 0⤋
I just love history. It is so interesting to learn about what our ancestors went through, and everything they did to win us all the freedoms that we have.
answer #7 · answered by Teslajuliet 4 · 2006-08-15 21:28:51 · 2⤊ 0⤋
So you will know what it was like back then, to get you ready for what is to come, because at the rate we are going, history is about to repeat itself real soon.
answer #8 · answered by Richard l 1 · 2006-08-15 21:47:32 · 0⤊ 0⤋
It's for you to realize what a great country you live in, the freedoms that you have that other people don't, and the sacrifices that people made so you could enjoy those freedoms.
answer #9 · answered by Vagabond5879 7 · 2006-08-15 21:37:28 · 1⤊ 0⤋
Those who do not learn from history are doomed to repeat it.
answer #10 · answered by Brother Mutt 2 · 2006-08-15 21:24:22 · 2⤊ 1⤋