Personally, I think the sex ed curriculum being taught in a lot of schools should not be taught to our kids. Teaching our children about sex is the parents' job. If schools are going to teach anything on sex ed, it should be the anatomy; it's not the school's place to teach kids about oral sex, anal sex, and homosexuality.
Just curious about how other parents think about this.
2007-03-13 12:56:44 · 2 answers · asked by Bryan M
What I'm talking about is what is being taught to kids in schools today. When I was in school, those things were not taught. But in schools today, kids are being taught about homosexuality and are given condoms without parental consent; the schools basically tell the kids, "We would rather you not have sex, but if you do, here's a condom."
They teach "safe sex," meaning use protection, instead of teaching that the safest sex is not having it at all, by abstaining, and applauding any teen who makes that choice.
2007-03-13 14:04:47 · update #1