On American society today: the liberal media, still, though not by so large a margin as in the last three decades of the 20th century. Christianity has had something of a resurgence, but while most Americans identify as nominally Christian if pressed to claim a religion, most are not that devout, and only a minority (some 30 million at the outside, if I recall correctly) are evangelicals who consider influencing politics an important way to advance their religious beliefs.
Historically, though, when you talk about the foundations of our society, centuries of Christianity have done far more to shape Western society than the liberal media could have accomplished in the last 50 years.
2007-11-02 07:38:07 · answer #1 · answered by B.Kevorkian 7 · 0⤊ 1⤋
In terms of history? Christianity. It's hard to imagine the world without the effects of Christianity and its counter-effects.
In terms of current culture and society, I'd have to say the liberal media. The overwhelming paradigm of the current moral compass is relativism, which brings about the effects of the liberal media.
2007-11-02 07:38:31 · answer #2 · answered by Traveler 5 · 2⤊ 1⤋
Tough question (in America anyway). Historically, it's been religion. Our country was founded with Christian morals as a major driver. But recently the media has been playing more and more of a role due to the decreasing role of religion in people's lives.
I would still say that Christianity has the greater influence, but this might not be the case anymore in 20 years when the next generation comes around.
2007-11-02 07:30:21 · answer #3 · answered by HokiePaul 6 · 3⤊ 3⤋
The percentage is simply higher for Christianity, and that includes all the liberals who have faith too. As for the liberal media, they use a minority status to push their agenda. They mirror religion to a tee; the correlation between the two is remarkable. However, they have the phrase "separation of church and state" on their side, which is a disadvantage for the religious side.
2007-11-02 07:35:39 · answer #4 · answered by rance42 5 · 1⤊ 3⤋
Since I believe the liberal media is a myth, I would have to go with Christianity. Ignorance really plays the biggest role in our society, but that goes hand in hand with religion.
2007-11-02 07:43:17 · answer #5 · answered by Anonymous · 0⤊ 3⤋
Obviously Christianity, since "the liberal media" is a myth. If the media were so liberal, Bush would have been exposed as the unfit-for-the-presidency liar he is back in 1999.
2007-11-02 07:36:36 · answer #6 · answered by Anonymous · 1⤊ 2⤋
What? I'd like to say Christianity, but after all the Jesus jokes I've seen in almost every famous comedy (movies and cartoons, South Park, etc.), I doubt there is much respect left for CHRISTianity. And if that doesn't do it, the growing acceptance of Darwinism sure has. Besides, people would rather hear "do whatever you want" than "obey me and I'll give you a reward."
2007-11-04 01:18:34 · answer #7 · answered by Jerry 4 · 1⤊ 1⤋
That's a good question. I'd definitely say Christianity; the liberal media has not been around nearly as long as Christianity has.
2007-11-02 07:31:10 · answer #8 · answered by Anonymous · 3⤊ 3⤋
Yes, tough one, but good question. These days I think the liberal media might have the greater influence, because everyone either agrees or disagrees with it and relates to it through the issues in today's society.
2007-11-02 07:31:53 · answer #9 · answered by Workcompguru31 4 · 1⤊ 5⤋
These days I would have to say the media, most of which is liberal, though some of it is conservative; either way, it has the greater influence. I agree with the first answerer that it used to be religion, but not these days. The media seems to be everywhere, even more so than religion. Now, there are some religious extremists out there, but most people understand they are extremists and pay no attention to them!
2007-11-02 07:33:58 · answer #10 · answered by tll 6 · 2⤊ 3⤋