I consider myself a Christian and am trying to grow in my faith and obey the Word, but there are grey areas of Christianity that I don't understand. Why, when someone does you wrong, does a Christian have to find a way to forgive the wrongdoer? For instance, if someone molests you as a child, you have to forgive the perpetrator. If a father or mother abused you as a child, was never there for you when you got older, and then you become successful in life, why is it the Christian who has to reach out, try to form some kind of relationship with the parents, and find a way to forgive them? Another example: a boss asks you to develop some ideas to improve a job he is in charge of, you do it to the best of your ability, and your boss presents it to his boss and not only gets a raise for it but also a promotion. Why do you have to forgive him?
It seems like Christianity makes you go against your natural feelings and your natural self.
2007-09-14 03:26:13 · 10 answers · asked by Penny in Religion & Spirituality