
The wild, wild West era was when there were cowboys and shootouts going on, but the slavery era was with the African slaves and the Europeans and their later generations who still had that mentality and kept slavery going for years and years. I know for a fact that it even goes on today. So, what do you guys out there think?

2007-01-20 10:08:08 · 12 answers · asked by Jennifergir 1 in Arts & Humanities History

12 answers

The wild west and slavery were going on at the same time, with slavery beginning first.

2007-01-20 10:12:22 · answer #1 · answered by dlin333 7 · 4 0

It was after the Civil War era... the people shooting each other were renegade soldiers who had fought in the war and had the mentality that they could continue that behavior in civilian life.

Slavery started so long ago. The Dutch brought it here before the union of the first 13 states. It's as old as the European settlers themselves.

The Irish and Scottish were also sold as slaves alongside the Africans, even though it was less frequent.

If you study history through time, even white people were slaves: the red-haired, freckle-faced, and blond were slaves to the Romans. They were the Huns, or Nordic people, who had no rights and were sold as slaves for much longer than slavery lasted in the Americas.

Don't forget the Chinese; they were slaves too, especially when they were building the railroads.

Take a look at Australia; get the movie called "Rabbit-Proof Fence" to understand what the natives went through, too.

History repeats itself no matter what the race; it's all about how one society can suppress another to get what it wants.

2007-01-20 10:20:25 · answer #2 · answered by Anonymous · 2 0

Well, slavery went on for thousands of years, up until the 1860s in the US. The wild west you see in the movies depicts an era from after the Civil War until about 1900. So the wild west came more after US slavery than before or during it.

2007-01-20 10:12:09 · answer #3 · answered by Yahoo Answer Rat 5 · 1 0

What most people don't realize (our real history is not taught in school; it's taught that all we have contributed is that we were brought over as slaves and then freed... there is so much more to this story) is that not all blacks in America were slaves. There were many black men who had established businesses, and there were even black men who owned slaves. There were also black cowboys; I would highly recommend that you see the movie "Posse," directed by Mario Van Peebles. A very insightful and eye-opening movie.

And yes, there are people who have a slave mentality, but only because they choose to. The knowledge about ourselves as a people, our history, is available, but this generation would rather have Escalades and bling bling instead of realizing the true power that we have as a people. Most people don't even know or realize that civilization came from Egypt; if you do your research, you will find that modern scholars do not dispute this. And remember, Egypt was a land of black people (Africa).

But anyway, I've strayed from the answer. If you like, you can email me with black history questions; my boyfriend and I are history buffs. I agree with Laura about other ethnic groups being slaves. So true. Also, did you know that whites were slaves first? Read those ancient history books, girl; all your questions will be answered and you will be truly amazed at what you find.

2007-01-20 10:46:19 · answer #4 · answered by SexiTash 2 · 3 0

Slaves were brought here before cowboys arrived, and slavery has existed for thousands of years; Americans and Africans did not have a corner on the market of slavery. We seem to have forgotten that many white people arrived here as indentured servants, including my own ancestors.

2007-01-20 10:13:23 · answer #5 · answered by LoneStarLou 5 · 4 0

The Wild West truly began with the Lewis and Clark Expedition of 1804. Even before those two Army captains and the Shoshone woman Sacajawea began their trek westward, many men lived to hunt and trap, trading their pelts to the Indians. These men, truly wild and formidable, were called the Mountain Men, and it was they who eventually turned away from an unprofitable fur business to become leaders, trackers, and scouts for the emigrant parties.

2007-01-20 10:22:03 · answer #6 · answered by Guitarpicker 7 · 1 0

I think the wild west idea started before the Civil War, but the "Oregon Trail-ism" of wanting to move out west continued into the 1890s.

2007-01-20 10:16:46 · answer #7 · answered by heart_attack_2006 2 · 0 0

Angel, make a point of viewing "Dances With Wolves". You'll get a pretty good idea of what your grandfathers and great-grandfathers destroyed. (Actually, the North American "wild west" was not a patch on Australia's wild N.S.W. back during the 1850s to 1890s. Look it up.)

2016-05-24 02:00:56 · answer #8 · answered by Anonymous · 0 0

The "wild west" lasted only 40 yrs after the civil war. It was marked by a great deal of "yellow" journalism and mythic hogwash.

2007-01-20 10:12:44 · answer #9 · answered by Sophist 7 · 0 0

I am pretty sure that the western migration came after the slavery period. It is funny that you should ask, because the term "cowboy" was actually a white term given to the black cowhands.

2007-01-20 10:14:00 · answer #10 · answered by Anonymous · 2 1
