
I have to write a 2-page paper on how slavery changed America, and it is due tomorrow... Any facts, internet sites, or anything else would be helpful. Thank you.

2007-09-16 12:10:56 · 5 answers · asked by Heather D 1 in Education & Reference Homework Help

5 answers

Yes, it's in your textbook.

Homework is called homework for a reason.

2007-09-16 12:13:52 · answer #1 · answered by Anonymous · 1 2

www.earlyamerica.com/review
rootsthebook.com
slaveryinamerica.com

This is an especially sensitive topic for me, and I would like to encourage you to really read the material you find online and write a good paper.

The major change was economic. The British found they were unable to continue using Native Americans as slaves because many died from diseases brought over from Europe and could not endure the heat of working in the sun. In their travels, the English spied out other lands and noticed that the continent of Africa had Black men and women who not only toiled in the sun but moved with a certain passion in their work. Many Africans were lured to the ships, some by colorful cloth strewn on the ground leading directly into captivity; in other cases, traders showed the local kings their firearms and exchanged these unusual goods for the kings' own people.

Cotton in the American South was plentiful, but there was no one to work those thousands of acres; cotton in the South was king. The enslaved were separated from their families and prevented from continuing their religious practices or languages. Heads of families were often taken from their wives and children, and the children were sold off. Whips and other forms of beating were used to keep the slaves in line while white Southern landowners grew rich.

There is so much more to the story of how slavery changed America — not only economically, but also socially, politically, and civically.

Good luck.

2007-09-16 12:24:29 · answer #2 · answered by THE SINGER 7 · 0 0

the slave trade developed the cultivation of coffee, cotton, and later sugar, esp. in the West Indies — technically the Caribbean rather than the mainland, but you can still put that in there

slaves made up quite a large percentage of the colonial population, esp. in the southern colonies

slaves provided the labor for the South's cotton boom, which made the Southern economy dependent on the (sad) slave system

it changed N. America in that it later caused the Civil War...

Stuff like that. Google it.

2007-09-16 12:17:08 · answer #3 · answered by psgr 3 · 0 0

In a nutshell, it indirectly created the poorest minority in the US, African Americans. When they were "emancipated," they were released into society as "free" people, yet they could not vote or better themselves economically. Though some Black Americans have managed to escape this in the present age of racial integration, many are still caught in a vicious cycle of economic stagnation, which is largely responsible for today's ghettos.

2007-09-16 12:16:01 · answer #4 · answered by Anonymous · 0 0

It tortured a great many people and dragged the US into a civil war.
Beyond that I don't know, but remember that the effects were not positive. Any benefit derived from slave labor was destroyed by the Civil War, so it was a lose-lose situation for everyone.

2007-09-16 12:15:24 · answer #5 · answered by Anonymous · 1 0
