I live in Tennessee and, of course, the midterm elections are coming up. One of the candidates, a conservative Republican, keeps mentioning that the Democratic candidate is a liberal (he is really more of a moderate), as if that were a bad thing. I think one of the worst things that has ever happened to liberalism is that liberals have become more moderate, or have stopped calling themselves liberals for fear of being slandered. Those who are liberals should be proud to call themselves liberals. Why? We liberals believe in:
Clean Air and Water.
Civil Rights.
Free Speech.
Pro-choice.
Pro-Environment.
Progressive Taxation.
Separation of Church and State.
Social Security.
Universal Healthcare.
Workers' Rights.
These are just a few issues, and having these ideas is in no way bad.
But anyway, do you think that "liberal" has become a dirty word? Please give me your thoughts.
Thanks.
2006-09-29 09:45:53 · 22 answers · asked by Anonymous