I was flipping through channels the other day and came across an interview with Jerry Springer, who claimed that the elections showed our country is becoming more and more liberal.
Do you think this is true? Consider that the only way the Democrats achieved the major wins they needed this election was with pro-life, pro-gun, anti-gay-marriage candidates, hardly the standard for liberal policies. Couple that with the fact that the Republicans were caught up in an unpopular war, and add that people were fed up with corruption in the Republican party. Can you really say the nation got more liberal, or did the Democrats get more conservative to capitalize on discontent in the conservative base? The Republican victory in '04 was about values; this year it was about corruption, a lack of values, so the theme is the same. Also consider that Congress is still made up of a conservative-leaning majority: how much of Pelosi's leftist agenda will she actually get through?
2006-11-15 04:02:27 · 14 answers · asked by chuck3011 in Politics