From conversations I've had and posts I've read here, it seems that schools in the South teach that the war was called "The War Between the States," and that it had nothing to do with slavery but was actually about states' rights. Anyone who has studied even a little about the war and what led to it knows that the only "states' right" the South was worried about was the right to own slaves, so pretending it wasn't about slavery seems like an attempt to whitewash the past. Can anyone who goes to school in the South, or used to, tell me whether this is really still happening, and what you think about it? I'm honestly curious.
2007-12-17 04:27:56 · 13 answers · asked by Candy (Level 5) in History