I find it very interesting that back during the Civil War era, Democrats were almost exclusively rich white men from the South. They were big-time supporters of big business (cotton, tobacco) and exploited the working class (most notably through slavery), while Republicans were at least against slavery...and some even supported equal rights for everyone. It used to be that the Democrats were the conservatives and the Republicans were the liberals...when and why did this all change?
2007-02-25 05:01:48 · 6 answers · asked by Anonymous in Politics & Government ➔ Politics