Maybe it's just me, but I was raised around livestock and horses. I've witnessed many births, human and animal, and I don't see the need to medicalize such a natural process. I don't understand why women are encouraged to fear pregnancy and birth. I'm not saying bad things don't happen, but if it were as dangerous as the medical "experts" would love for us to believe, the human race would have died out a LONG time ago!
What do you think?
2007-02-27 11:06:43 · 8 answers · asked by Anonymous