
22 answers

Yes, but in their minds they are doing the exact opposite. It is an argument that will never be settled or go away.

2007-12-07 05:16:19 · answer #1 · answered by Anonymous · 3 4

Do you really think that liberals see the changes they want as ruining anything? I think the country can only be strengthened by some of the things I'd like to see done, the same way conservatives feel about theirs. We all want to make the country better; we just disagree on how to do it.

2007-12-07 05:25:49 · answer #2 · answered by leonorfarfan 2 · 3 1

Now, as a conservative, I find that "liberals" sincerely believe the changes they espouse will make the country better.

But I also believe that the changes they think will make the country better will not have that result. In fact, they will do untold damage to this country. Each step we take that does not fall under the Constitution, such as increased "entitlements" or government-controlled health care, increases the power of government over our lives. Never in the history of the world has increased government power failed to result in decreased freedom and liberty for the people.

History is replete with examples of this lesson, yet many fail to learn it. Or, in seeing it, declare it to be inapplicable to them; but they're wrong.

2007-12-07 05:30:28 · answer #3 · answered by Anonymous · 2 1

NON-SUBJECTIVE ANSWER:
This country HAS become more liberalized.
SUBJECTIVE:
And judging from these boards, I feel that no one is truly happy with this. So yes, I feel liberalism has ruined a more gentle, conservative world.

2007-12-07 06:51:27 · answer #4 · answered by Anonymous · 1 0

This country always changes; that's why we can add amendments to the Constitution. Our founders knew they didn't have all the answers, so they gave us a government that can be changed, and liberals just want it to be done in the spirit our founders would have wanted: justice and equality for all, not for the privileged few who seem to hold power now.

2007-12-07 05:20:44 · answer #5 · answered by region50 6 · 2 3

Change it? Probably.

Ruin it? No.

An example: My mother, whom I respect tremendously, is a liberal. While we disagree on most political issues, we agree on many other things. I think she would like to change our country, but she would want to do so for the better.

Would she and I agree on what changes are needed? Not likely. But even though we disagree, I know she wants what is best for our nation, just as I do.

2007-12-07 05:16:43 · answer #6 · answered by ItsJustMe 7 · 5 2

Liberals can't ruin a country already destroyed by the Republicans and the Bush crime family. Why are you being a troll?

2007-12-07 05:16:27 · answer #7 · answered by sky64 5 · 3 4

Instead of parroting what you hear others say, it would be more interesting if you explained to us how you think liberals will "ruin" the country.

2007-12-07 05:17:00 · answer #8 · answered by Anonymous · 4 2

Liberals created this country, conservatives like Dick and Osama want to destroy it.

Clinton had the country running well; GW has run us over a cliff. You need to buy a clue.

2007-12-07 06:25:23 · answer #9 · answered by poet1b 4 · 1 3

The extreme right came about, or awakened, because of the extreme left. But liberals are no friends of the US. They'd hurt this country just to regain power.

2016-11-13 23:45:11 · answer #10 · answered by ? 4 · 0 0

Yes. Most people know this. The Democrats in general want nothing more than to come to power. They really don't care what happens after that; they just want to be on top.

The liberals want to change the country from what it was originally meant to be back into what we came over here to escape. This is inherently bad.

2007-12-07 05:18:08 · answer #11 · answered by Sam64 3 · 2 5
