
23 answers

Not all Christians. There are many Christian denominations that actually believe the church should have no part whatsoever in government and political matters. Now, the Catholic Church, believe it or not, does want to dominate every aspect of life, and it already exercises remarkable control over government, whether we know it or not. I can guarantee you that very soon, freedom of religion will be limited, not only in this country but all over the world. And that order will come from the head of the Catholic Church, the pope. The Catholic Church will proclaim itself (it already has) the one and only true religion, and it will have complete dominance... everywhere. Just watch closely.

2007-07-27 07:54:56 · answer #1 · answered by Diamantez 2 · 1 0

We already do dominate the world, and all of the USA. Christians have been the dominant force in the USA ever since it was founded, and we still are, not so much because we want to but because we are the majority and always have been.

As far as the world goes, we are the dominant force, but we do not dominate the world as such. That is, our views are not forced onto anyone, and we don't try to force them.

2007-07-27 14:52:18 · answer #2 · answered by Anonymous · 1 2

It is very interesting that you used the word "Dominate" in this question...
To answer this correctly, we must first remember that Christianity, as a religion or spiritual way of life, is basically about leading a Christ-like life: one of Love, Peace, Harmony, and Charity, and one of Hope that all who practice it will find spiritual fulfillment. Yet there are many organized religions that call themselves Christian but are not. It is true, as so many will witness and testify, that most churches are nothing but breeding grounds of hypocrisy and prejudice, which is definitely not Christ-like. Still, they call themselves Christians...
A true Christian, one who attempts to live that Christ-like life, finds that organizing into groups is not compatible with the basic ideology of Christianity, which is perfectly suited to the solitary practitioner and is better spread and taught by example than by exhortation from outdated scripture that frightens rather than educates. True Christianity is very much akin to Wicca in this respect: it is practiced by the individual and strengthened and supported by the group, without need of grand, expensive churches or temples. Outdoor worship and fellowship are just fine for a real Christian.

I think the only thing a true Christian feels the need to Dominate is his own spiritual self and his power over the adversary.

Oh, and the Latin title given to Jesus Christ?
Dominus, meaning "Lord."

Peace!

2007-07-27 15:11:19 · answer #3 · answered by The Mystic One 4 · 0 0

Yes. The Bible states that anyone found to be an unbeliever should be killed, and that any city not predominantly of your faith should be razed and everyone in it destroyed. Yes, Christians want to dominate the world. It's unfortunate that they already dominate the USA.

2007-07-27 15:37:19 · answer #4 · answered by The Rationalist 2 · 1 0

Some Christians might, but I see an error in that thinking. How often do we have to see in history that when a theocracy is in power, persecution and death follow?

The Muslims are in the midst of their theocratic laws, with all their stonings and hangings just for sex. As the rest of the world looks on in horror, the Muslims, even those who call themselves "peaceful," stand idly by and excuse it as the carrying out of their version of the Law.

We had the same in the Roman Catholic Church. It was considered the "true church," and the blood it spilled over those centuries remains an atrocity on the face of the Earth.

The only time I would accept a theocracy is when Jesus returns. He will be the only one who knows how to keep things in line.

2007-07-27 14:58:14 · answer #5 · answered by Christian Sinner 7 · 0 0

Why should a true Christian wish to dominate this world? We are called to live separate from this world and to seek our future in Heaven, not in the kingdoms of men.

2007-07-27 14:51:39 · answer #6 · answered by Anonymous · 1 0

I cannot speak for everybody, but I'm a Christian and I don't. If I feel like conquering the world, I play Civilization or Risk. It's more fun to conquer a fake world anyway.

2007-07-27 14:52:04 · answer #7 · answered by Phil K 3 · 0 0

Good heavens, no! I certainly hope not! We've already tried mixing religion and politics, and it was an unqualified disaster (the Crusades, the Inquisition, witch hunts, etc.). Religion should be about doing good works for others, giving to those in need, and caring for the spiritual well-being of everyone, whether that includes converting them or not.

2007-07-27 15:00:23 · answer #8 · answered by nardhelain 5 · 0 0

World domination isn't how I would describe it, but we certainly desire that the world be governed by Christ.

- However, we also need people to understand that we're democratic about it. We don't force it on people who don't want it!

(**Their loss.**)

My personal thoughts on all of this are: I would like to see TRUE CHRISTIANS in positions of leadership, and I would love for EVERYONE to acknowledge the wonderful name of Jesus...

Realistically, I know this isn't going to happen. Otherwise the Bible wouldn't be written the way it is, and we wouldn't find ourselves in the End Times. You are watching prophecy unfold.


God bless.

(Thumbs down me, if you like --- I don't care.)

2007-07-27 14:56:09 · answer #9 · answered by redglory 5 · 0 1

We don't want to dominate anything.
God dominates us; therefore, we don't need to dominate anything.
We are not called to dominate; we are just called to go and tell the gospel.

2007-07-27 14:51:40 · answer #10 · answered by Jessica A 2 · 1 0
