Well, I'm agnostic, and I think that in some way they feel they were right when it comes to slavery.
2006-12-30 09:56:21 · answer #1 · answered by Anonymous · 0⤊ 1⤋
Southerners are educated, too. Of course they know who won the war. If the South had won, the flag would probably be the Confederate flag, but I doubt very seriously that there would still be slavery. The war was NOT fought over slavery but over states' rights, you see; slavery was at most a secondary factor. Most people and households in the South did NOT even have slaves. Only large landholders and plantation owners had them; ordinary Southerners did NOT.
2006-12-30 18:00:27 · answer #2 · answered by ruthie 6 · 2⤊ 0⤋
The War of Northern Aggression was not fought over slavery! Read a US history book and learn something for yourself, the truth! It appears all you know is what someone has told you, or some false information you picked up during a gossip chat! You certainly have no history education on the subject. Southern Christians know it doesn't make any difference who won the War of Northern Aggression. Southern Christians know that JESUS won the only victory that means anything 2000 years ago; nothing else matters!
2006-12-30 18:14:02 · answer #3 · answered by Anonymous · 1⤊ 0⤋
Do Southern Christians believe they won the Civil War?
NO.
Just a FEW of the general public do.
"There is a saying in the deep south....."The South Shall Rise Again".
I heard it a number of times while traveling.
(I am a Canadian living in Canada; I have traveled and spent some years all over the USA.)
2006-12-30 17:56:20 · answer #4 · answered by whynotaskdon 7 · 1⤊ 0⤋
I'm from Tennessee, and my dad's a Methodist minister. And I'm an atheist.
This question is a subtle but clearly prejudicial remark about Southerners. No Southerners believe they won the Civil War. While racism has not been conquered in the South, it has not been conquered anywhere in the US. Rodney King was not beaten in Birmingham; he was beaten in Los Angeles.
If you consider yourself a person of spiritual depth, then I encourage you not to paint individuals with negative stereotypes. Otherwise you're simply engaging in the same behavior you've criticized in others.
2006-12-30 18:03:15 · answer #5 · answered by NHBaritone 7 · 1⤊ 0⤋
I don't know of anybody who is delusional enough to think that the South won the Civil War. And what has Christianity got to do with it? That's like asking, "Do Christians believe that eating oatmeal will lower cholesterol?"
2006-12-30 17:56:20 · answer #6 · answered by NONAME 7 · 1⤊ 0⤋
There might be a Confederate flag for the Southern states, but I imagine slavery would be long abolished.
Any true Christian knows that racism and slavery are just outright wrong, and ANYONE who claims to be Christian but is racist is not a true Christian at all. The Bible makes it clear in 1 Corinthians (I don't remember the chapter and verse) what a true Christian is.
2006-12-30 17:54:37 · answer #7 · answered by Anonymous · 1⤊ 0⤋
That question made no sense. What was the point of adding the word 'Christian' to your question? Are you trying to insult Christians, or was that just a mistake? Word your questions better; otherwise you look stupid.
2006-12-30 17:53:43 · answer #8 · answered by L-dog =) 3 · 2⤊ 0⤋
I am a Christian and I live in the South and have no idea what you are talking about. Jesus loves you.
2006-12-30 17:53:46 · answer #9 · answered by cindy j 3 · 1⤊ 0⤋
We all know the Union won the Civil War, you dim-witted bastard.
2006-12-30 17:52:27 · answer #10 · answered by Anonymous · 1⤊ 0⤋