I was talking with a friend once, and he told me (I forget exactly how it came up) that once you lose your virginity, you'll realize how completely pointless the whole thing is, since your first time will probably be something of a letdown compared to what you've been imagining your whole adolescent life.
If this is true, why does every aspect of society try to tell you that you're absolutely worthless if you haven't been laid yet? Does sex mean something profound (aside from how incredibly fun it is), or nothing at all?
2006-10-18 22:47:25 · 12 answers · asked by Anonymous in Singles & Dating