
2007-11-24 05:27:17 · 13 answers · asked by Camden O 2 in Arts & Humanities History

13 answers

Lizard, you said "Christianity has forced people to go extreme with their beliefs. People have changed it in the past to fit their own personal beliefs. Christianity seems to be the reason why people hate gays, abortions, and others who are not "trying to live a Christian lifestyle". In the past, it also may have had things to do with extreme racism and slavery.

Christianity is nothing but an organised system of beliefs and rules. It's caused more harm than good."

First of all, Christ hasn't forced us to any extreme. God wants us to live a life of righteousness because it is for our own good. As the word of God states in 1 Corinthians 6:12-20, "Everything is permissible for me" - but not everything is beneficial. "Everything is permissible for me" - but I will not be mastered by anything. "Food for the stomach and the stomach for food" - but God will destroy them both. The body is not meant for sexual immorality, but for the Lord, and the Lord for the body. 15 Do you not know that your bodies are members of Christ himself? Shall I then take the members of Christ and unite them with a prostitute? 18 Flee from sexual immorality. All other sins a man commits are outside his body, but he who sins sexually sins against his own body. 19 Do you not know that your body is a temple of the Holy Spirit, who is in you, whom you have received from God? You are not your own; 20 you were bought at a price. Therefore honor God with your body.

God is only good, loving, and eternal. He is perfect, but we are the ones who are imperfect. Man has done much to misinterpret the word of God, but God's word has always been right and just, and always will be.

And to answer the original question of how it affected the New World: in my humble opinion, Christians weren't allowed religious freedom in England, so they came here to be able to worship God in their own way without fear of persecution. Our country was founded on God and blessed by Him. Our country has become immoral in so many ways, and you wonder why we are in decline? We can't honor God in our schools? People have completely misinterpreted the amendment separating church and state. Our founding fathers were all Christians and never intended to remove God from our country. They simply wanted to ensure that none of the three branches would have the power to force their religious beliefs on the rest of the nation. Even the Supreme Court has the Ten Commandments in the building. Look at the back of your money: "In God We Trust." Think about it!

2007-11-24 06:07:30 · answer #1 · answered by Anonymous · 1 0

Christianity In The New World

2016-11-07 22:40:44 · answer #2 · answered by ? 4 · 0 0

You could get two different answers to this, and depending on viewpoint, both could be true.

It had a much bigger impact on the native peoples of South America, Central America, and Mexico, which were predominantly settled by Spaniards. At the time of initial colonization, Spain's rulers were guided (some might say dominated) by the infamous Spanish Inquisition. They were infamous for a reason: not nice guys. From a scientific, sociological perspective, the missionaries basically forced the native people to become Christian, thereby destroying their native religions and cultures, and their identities in the process. Everything they had known and honored was gone. General imperialism and greed for riches accelerated this process, of course, but Christianity was a prime mover in the wholesale destruction of entire world-class civilizations, and of the individuals who made up those societies.

If you are a Christian, though, you might say those civilizations were evil; some had human sacrifice, and most had forms of slavery. They had false religions, you might say, and the world, and the native people themselves, were better off without those bloodthirsty gods. By being introduced, even at gunpoint, to Christ, their souls, and those of their descendants, were saved. I wouldn't say that, but for Christians, those would be valid points.

These points also apply to North American civilizations, but I think greed for land was a bigger factor there, as well as a false belief that "superior" European culture had a right to wipe out inferiors. Religion played a part, but not as large a part as it did in the countries that eventually became Latin America.

2007-11-24 05:51:05 · answer #3 · answered by Bartmooby 6 · 0 0

You can answer this according to your beliefs. What kind of impact do you WANT it to have had? If you want Christianity to come across as good, then you answer something like: the underlying belief in a Christian God in America brought about the foundation of democracy in political standing; a legal system which borrowed heavily both from ancient legal tradition (the Code of Hammurabi) and from more recent biblical traditions in the application of mercy in justice; and an economic/social system which allows individuals both assistance in times of need and penalty for monopolization (ha ha, yeah, right). On the other hand, if you want Christianity to come out smelling sour, you go with something like this: in the name of Christianity, an entire population was exterminated. The "new world" wasn't new to some, but those peoples and cultures (their languages, music, art, and ways of living) were completely lost in the name of well-meaning Christians with a belief in Manifest Destiny.

2007-11-24 05:43:32 · answer #4 · answered by kurtisgod2 2 · 1 0

We would have been able to read the Mayan libraries if the priests hadn't pronounced them as pagan and burned them all.
Christianity killed off all other native religions.
In saving their souls, the priests made them slaves for the Spanish and taught them to accept their life here on earth and to focus on the life in the hereafter.
The colonists in America created the same intolerant conditions in their colonies that they were fleeing in England.
Thus the American founders created a separation of church and state to give people a choice of religious beliefs without having to financially support a state religion.
The strict religious beliefs seem to create a character that can endure great hardship in order to create a change.

2007-11-24 05:44:10 · answer #5 · answered by Anonymous · 0 0

I still consider myself a Christian. However, when I attend Mass, I notice that it is still the same as it was before, except that, if you observe closely, you will also notice that church ushers behave like goons, some of the priests utter words only they could understand, not minding the herd of faithful worshipers who were not praying at all, and some of the newly appointed priests are not even sure of what they profess, or cannot even hold a debate on some important subjects.

Yes, I am very critical in what I observe. The only "improvement" I see from all this is that in every congregation there is corruption, be it money, material things, even pedophilia!

Yes, I was tempted to become the opposite and almost converted to Islam. But I cannot forsake the love of JESUS, who will always be loyal to me.

Yes, I want to remain in Jesus' flock.

2007-11-24 05:56:57 · answer #6 · answered by randomX1 3 · 1 0

Christianity has forced people to go extreme with their beliefs. People have changed it in the past to fit their own personal beliefs. Christianity seems to be the reason why people hate gays, abortions, and others who are not "trying to live a Christian lifestyle". In the past, it also may have had things to do with extreme racism and slavery.

Christianity is nothing but an organised system of beliefs and rules. It's caused more harm than good.

2007-11-24 05:40:56 · answer #7 · answered by Anonymous · 0 1

It had a huge impact. It went from a small following to a worldwide faith in two thousand years. It is one of the biggest faiths in the world, alongside Islam and Judaism.

Remember what Jesus said when He told His followers He would make them "fishers of men."

2007-11-24 05:58:31 · answer #8 · answered by chrstnwrtr 7 · 1 0

If you go to Europe or many other places, it's socially acceptable to go out partially nude. You may see naked people on signs, or on TV, more often. Here in the US you won't find that, because of a certain group of Christians who came here; I forget which ones, though. It's pretty much taboo here to do that kind of stuff.

2007-11-24 05:31:02 · answer #9 · answered by I am Ninja! 3 · 0 0

It set the stage for tolerant colonies, states, and a more tolerant country.

2007-11-24 05:30:25 · answer #10 · answered by Matt Shank 3 · 1 0
