In America we do not respect positions of great importance. We would rather spend our money paying high-level television executives who offer us mindless entertainment. The mindset seems similar to the spoils system: when lavish rewards were offered for government jobs, corrupt people applied. If you are a teacher, there aren't many benefits unless teaching is what you truly want to do. So in a way, it proves your devotion.
2007-11-04 16:38:56 · answer #1 · answered by Anonymous · 1⤊ 0⤋
IMO it boils down to just a few points:
Teaching became a profession that was acceptable for women. Women have always been paid less than men for comparable work, and teaching, unfortunately, has stayed at lower pay even after men joined the ranks.
Even though parents, communities, and even our national leaders SAY they value children, the evidence suggests that we are not willing to PAY a good price when something involves children. This disconnect extends to day care, pre-school, and even pediatricians (who receive lower average pay than most other specialists).
School is the one thing that (almost) everyone in the country has experienced, so everyone thinks they are an expert in it. That attitude extends to the idea that anyone can be a teacher, or the old (but false) adage that says "Those who can, do; those who can't, teach." In other words, teachers must be those persons who couldn't cut it in another profession. QUITE FALSE, but a pervasive attitude nonetheless.
2007-11-05 00:27:36 · answer #2 · answered by English teacher 5 · 1⤊ 0⤋
In one of the Asian countries, China or Japan, they pay their teachers the same as their doctors.
2007-11-05 01:12:25 · answer #3 · answered by I Love Jesus 5 · 0⤊ 0⤋