
I assume Republicans are considered conservative, while Democrats are considered liberal. Yet what do these names and terms mean in terms of social issues, e.g., education, wealth distribution, race, size of government, crime, gay rights, corporations, the environment, etc.? I am very interested in understanding what each "side" thinks, in general, and how each operates to affect the nation.

2007-03-07 00:58:47 · 6 answers · asked by mezhenari 2 in Politics & Government Other - Politics & Government

6 answers

A conservative believes that the individual must serve the system (church, family, government, company).

A liberal believes the system must serve the individual.

In other words: do people live for society, or does society exist to serve people?

Obviously both have applications in different situations. Personally, I believe the liberal view should dominate, as Jesus said the law was made for man, not man for the law. Ergo, he was a liberal, although he did serve his church and humanity and saw the need for conservatism also.

2007-03-07 01:09:28 · answer #1 · answered by Fancy That 6 · 2 1

Research has shown that there is a lot of overlap in beliefs among the groups, and that a lot of the perceived differences are in 'marketing' and 'packaging' of ideas.

For example, most Americans support protecting the environment, protecting gay rights, having a social safety net and providing public education. At the far right, you'll have people who support none of that.

The right and left in America basically correspond to fascism and to socialism. (But the right wing can't face that about themselves.) The right wing harkens back to 'the good old days' (which never existed), promotes its idea of religion for everyone, equates patriotism with militarism, and backs the 'rights' of corporations to do whatever, whenever, to anyone, because they've 'earned it.'

The left in America was reined in by FDR, who modified capitalism enough to sand off its rough edges while retaining its power. The left in America is also associated with expanding the notion of civil rights - applying the guarantees of the Bill of Rights to all Americans - while the right fights to retain privilege for the old guard: rich, white Protestants and the WASP wannabes.

2007-03-07 01:06:23 · answer #2 · answered by cassandra 6 · 1 0

"Conservative" and "liberal" used to have real meaning. But now they are merely buzzwords used as verbal props in campaign commercials and talk radio shows.

2007-03-07 01:11:57 · answer #3 · answered by Timothy B 3 · 0 0

The answer that will be acceptable to you will depend on your political beliefs. Sad, but true.

2007-03-07 01:18:08 · answer #4 · answered by Anonymous · 0 0

""conservative" and "liberal", and Dems and Reps"

It's all childish name-calling to avoid discussing the issues.

2007-03-07 01:03:55 · answer #5 · answered by Anonymous · 1 0

Liberals are generally associated with the Democrats and Conservatives with Republicans.

Liberals believe in big government, high taxes and low moral standards.

Conservatives believe in limited government, low taxes and a moral society.

Basically liberals are evil and conservatives are good...lol

2007-03-07 01:03:43 · answer #6 · answered by JHE123 2 · 1 7
