The
world today has taken the view that
sex is a bad, bad thing. It should
not be spoken of in
public, it should not be done before you are good and married,
and it is a
taboo thing that is hidden from the ears of the
population. Years
and years of puritanical views have filed away at our
freedom to express what
we are naturally drawn to. We feel
shame when walking into a sex shop for the
first time. We feel bad when fantasizing about what makes us
aroused. We look
down at our feet in public when looking through books on
sexual topics and we
blush when sexual terms are brought up. Society has taught us that sex is dirty,
that
genitalia are dirty, and that sex is a bad thing to have unless you are
trapped securely in the
bonds of marriage. Our schools teach as little about
sex as possible. They only tell the children what is needed to
prevent the little
girls from becoming
pregnant, and in most cases, they don't say enough for it
to be effective.
After saying this, I have to ask, "Why?"
Why don't mothers teach their little girls when they come of age to understand
that they have a vagina and that they have a urethral orifice, and a clitoris?
Why do the classes created to inform children of their sexual nature hide
things that children need to know? Why are we never told about intersex
babies, odd quirks about penises, interesting facts about breasts? Why are
children forced to learn all the countries of Africa and Russia and yet
are not taught about their own bodies?
For humans, sex is a natural bodily function. It comes as naturally to us
as eating, sleeping, and urinating. Our bodies are built to have sex. We are
given the hormones to tell us when we should have sex. Nature has supplied
women with a very accurate set of signs to tell them when they can and when
they cannot conceive. Why are our children not taught these things? Why do
we blush when someone talks about his or her penis or vagina?
Just the other day, I learned something that made me think much differently
about myself and about my body, and about society. I started doing research.
I read about many things that contradicted what I had been told when I was younger.
I learned that between the clitoris and the vaginal opening is a second, smaller
opening, called the urethral orifice. Women urinate from it. Never before was
I told this. No school, no book I was given, and no adult informed me of this.
I never looked because I was taught that the female genitalia had only this
and that and nothing else, and that they were bad and dirty. I never believed
that there was something else down there.
Why was I never taught? Why do so many children have sex and get pregnant?
The reason is that society hates sex. Society wants sex and anatomy of the
genitalia to be hush-hush. There are children out there who believe that if
the male puts a wristwatch around his penis during sex, his female partner
cannot get pregnant.
Why don't we teach people the right way to do things? Why is sex a bad
thing? Why are homosexuals and bisexuals and transsexuals thought to be
bad by the majority of society? What possible reason could society present to
us to say that we shouldn't teach our children about their urges, about how
to be safe, about what their bodies are telling them, and what everything is
used for? When are we going to come to terms with the fact that adolescents
are going to have sex and start teaching them how to have sex the safe
ways, instead of telling them half-truths that lead to guessing and trial and error?
Sex is not bad. Sex is a beautiful and wonderful way to have full body
exercise, relieve stress, and show someone you love them in ways words cannot
express. We are animals, and we are meant to have sex. It is a sharing of
nature and intimacy and it should not be hidden.