Has this become a forgotten notion in our society these days? What happened to respecting each other's views/opinions, no matter what we all believe in?
I'm still young, so I don't know whether it has always been this way or whether things have deteriorated over time, but we seem to live in a judgmental, closed-minded world. When disagreement arises, people take offense to it, for whatever reason. They feel they're being attacked, so they feel the need to put down others for what they believe in. They ridicule, criticize, and judge. Who is "they"? It's not just one group. It's society as a whole.
For example, I'm LDS (Mormon), and I've sometimes been told by others that I am not a Christian. I feel completely disrespected when I hear this, because I know who I believe in. I know what I am better than anyone else does. We all know ourselves best.
Is it necessary to go around telling others what they are, and what they aren't?
Live and let live. No matter what you believe.
2007-04-20 20:35:42 · 19 answers · asked by Daniel in Religion & Spirituality