I think it started in the 1970s. People didn't just want to be tolerated; they wanted to be approved of.
At the same time, people started "legislating morality" to a greater extent. The idea of "hate crimes" -- making an offense worse if it is motivated by racial, religious, or gender prejudice in the offender's mind -- is one example.
The distinction between "legal" and "moral" began to erode.
This was actually aided and abetted by conscientious, good people who felt it was not their place to judge others, and that judgment should be left to the courts and to God.
2007-03-21 10:56:04 · answer #1 · answered by The First Dragon 7
I don't think all Americans, or even most, believe that what is legal is "right," so stop making ignorant stereotypes.
2007-03-21 17:18:04 · answer #2 · answered by Gone, Gone, Gone. 4