This is really more a cultural question than it is a religious one. However:
I think that responsibility belongs to the parents. I got my teaching credential in the late 1980s. At the time, we were all told that we should be teaching "values" to our students. My response was, "Fine - but they'll be my values, because I'm not going to teach them something I don't believe." I never got a decent answer to that. I feel much the same way about sex education: if the parents want to abdicate that responsibility to me, I'll take it, because the kids need to be told something. But I'm only going to tell them what I believe to be true.
For what it's worth.
answer #1 · answered by Anonymous · 2007-02-24 20:03:50
I believe that it is better to inform children, or they will learn it some other way. Children will pick up several misconceptions about sex if they are not taught the truth about it and the responsibility that comes with a sexual relationship.
answer #2 · answered by Neko411 · 2007-02-25 04:06:21
First off, I'm a youth pastor.
I think sex ed is very, VERY important. In our youth group we have cell groups, and we go over it at one point in those groups.
The groups are divided down by grade level and gender, so we don't have guys and girls in the same group when we're discussing things.
They need to know what's going on from a source other than their friends and TV.
answer #3 · answered by Angry Moogle · 2007-02-25 04:05:33
Well, God created sex, and He is alright with older children learning about it. Christian bookstores even sell books about sex (what it is and how it is meant to be used).
answer #4 · answered by Anonymous · 2007-02-25 04:05:01
Parents should do it; it is their duty to their children. If they give that responsibility up, then they should not blame Father God or the schools.
answer #5 · answered by martha d · 2007-02-25 04:02:10
For six-year-olds - heck no.
In high school... sure.
answer #6 · answered by Anonymous · 2007-02-25 04:22:09