I ask this as a Christian. From reading the book of Galatians, it seems the apostle Paul makes it fairly clear that there is no need for Christians to circumcise anymore, and Paul even seems to imply that if you are still circumcising, you don't understand what the message of the gospel is all about.
So why is it still done by some Christians SPECIFICALLY? (I'm not asking about other religions, just Christians.) Is there a Biblical reason, or is it done only for medical reasons? And if it's medical, what are those reasons?
Note: I'm NOT asking this to offend anyone... I'm just curious whether there is a reason some Christians still have this done. (I've gotten into the habit of adding disclaimers to my questions, since I somehow manage to make people angry with them, even though all I'm doing is seeking information.)
Thanks.
2006-08-08 04:01:03 · 38 answers · asked by Rob (Level 5) in Religion & Spirituality