
Just heard this on the telly and have heard it in the past. I'm not sure whether it's meant to be religious or something to do with animals?!

2006-08-11 21:35:20 · 4 answers · asked by Mrs D 6 in Society & Culture Other - Society & Culture

4 answers

"How the west was won" typically refers to the call for manifest destiny in the United States and ensuing wars/genocide of the native peoples of the United States. The Indian wars have long since been over, unless you include Indian Casino's as a last ditch effort to take revenge on the white man.

2006-08-11 21:45:14 · answer #1 · answered by wackywallwalker 5 · 2 1

Are you talking about the film "How the West Was Won"? It starred John Wayne, with Spencer Tracy doing the narration; the storyline was about pioneering, from the East to the West.

2006-08-11 21:40:45 · answer #2 · answered by lefang 5 · 0 0

Are you serious?

The West was never "won"; sadly enough, it was "taken" from the noble Native American nations and tribes.

The West was lost at Wounded Knee and along the Trail of Tears.

2006-08-11 21:41:41 · answer #3 · answered by B'klyn Barracuda 3 · 1 0

Wha?

Led Zeppelin.

2006-08-11 21:38:02 · answer #4 · answered by Abby 3 · 0 0
