
2007-03-03 01:39:07 · 4 answers · asked by skooltransformer 2 in Politics & Government Law & Ethics

4 answers

Well, it depends. Many mandated things are evil, and many evil things are mandated. For instance, in all countries, imprisonment is mandatory for certain types of criminals, and even for certain types of non-criminals. In some countries, including the US, it was once mandatory to kill a witch or return a runaway slave, a hundred or two hundred years ago.

Still, that is not to say that education is evil. It becomes evil only when teachers teach students to hate people who don't speak English, or who are not Christian, and so on. Education is, in itself, a good thing. The mandatory part sometimes makes it difficult to get a good education, because it means a school cannot expel a drug dealer or rapist (if that person is a student at the school), since it is mandatory for him or her to receive an education.

2007-03-03 01:53:00 · answer #1 · answered by Anpadh 6 · 0 0

School isn't evil... you may not like it, but it's necessary in order to live in society. And yes, in the US school is mandatory.

2007-03-03 01:52:31 · answer #2 · answered by Enchanted 7 · 0 0

Well, there is one main reason why we go to school and college: to get a good job so we can make a lot of money and live happily. Mandatory, yes. Once you get to college, you go from there.

2007-03-03 01:47:32 · answer #3 · answered by Anonymous · 0 0

It is not mandatory. You could move your family to Africa and become pastoralists and be blissfully uneducated.

2007-03-03 01:44:23 · answer #4 · answered by Timothy M 5 · 0 0
