I have an honest question not meant to incite flames... I consider myself more of an Independent than anything else. I was always under the assumption that being a Republican meant less government involvement in our lives and more fiscal responsibility. When did social issues and religion get entangled into the mix? Isn't the very essence of being a Republican LESS government forcing itself down the people's throats? So why all the attempted legislation to control what we do with our bodies, in our bedrooms, and who we choose to love and cohabitate with? Furthermore, isn't the deficit in severely bad shape? Based on these principles, does the Bush administration represent what it truly means to be a Republican? Back up your answers with a little thought, please... Republicans, do you feel that your political affiliation has been tarnished by the state of the United States today?
2007-05-26 03:12:28 · 7 answers · asked by EsoMan · in Politics