This is not some anti-war statement.
But I have noticed a trend. When Republicans are in power, the country generally does well, but the people don't see it because they are getting screwed.
When the Democrats are in power, the people are generally happier, but the country falls into the toilet.
So which is more important? I stress this because I hear people put down the social policies of Europe by saying those countries are unable to compete or have slow-growing GDPs. But the people themselves have more liberties and are more heavily invested in.
Likewise, the OPEC nations are usually on the up and up economically. But because oil drives the economy, not the people, the people suffer to the point where strapping a bomb to their chest seems like a legitimate option.
So which is more important, America or Americans? Why? And is a middle ground possible?
2007-02-26 02:33:42 · 5 answers · asked by gatewlkr in Politics & Government ➔ Politics
I agree wholeheartedly with you, Blackd...
Man, this question must be too complicated for most of YA... maybe I should dumb it down or go all eccentric and become a Demopublican.
2007-02-26 02:44:27 · update #1
I mean Democrat or Republican.
Damn independent thought. Wait, maybe I'll have more friends that way.
2007-02-26 02:47:58 · update #2