Why? When it is a white "culture" doing the teaching, there is little or "NO" truth taught in the schools about the Native Americans. Not once in your history class will you hear that the "Indians" had a right to be angry at the settlers, because they were being forced off lands that had been their home for thousands of years. You will not hear how the "Indians" were lied to, tricked (easy to do, because our Peoples lived by honor and respect), murdered, had bounties put on them, were sold or traded blankets deliberately infected with disease and whiskey that clouded the mind and sickened lives; how the women were raped, treated as less than dirt, sold as slaves, beaten, and abused in so many ways. No, you will not be taught the "truth" of the "settling" of these lands, or the "taming" of the savage "heathens" who were "godless." No, you will not be taught the truth that our Peoples were far more civilized and advanced, socially, morally, medically, and spiritually, because then they would have to admit it was a genocide far worse than anything "Hitler" and the "Nazis" ever did!
2007-06-12 10:29:09 · 23 answers · asked by Anonymous in Other - Society & Culture