Someone says the right hates America; someone else says the left hates America. Why can't everyone just put aside their differences and realize we need to work together to survive?
Why do conservatives and liberals fight all the time instead of both just accepting that the government needs real change? And not just come election time; do something the rest of the time to promote peace in the world.
How can we be at peace with the rest of the world if we, as Americans, as one people, cannot live in peace with each other?
Does anybody have any clue?
Serious answers only, PLEASE?!
2006-09-25 10:15:45 · 9 answers · asked by mindrizzle in Politics