These days, women seem to be so removed from what it is that makes us women. I am all for breaking through the glass ceiling, but there are certain things we need to embrace as women, including our menstrual cycles. And who says? Well, Mother Nature for one, or how about the innate intelligence of our bodies, developed over millions of years of evolution? Our bodies' intelligence truly goes beyond what we can comprehend... and yet we continue to suppress its actions on a daily basis. We have drugs to take away the pain, and much of the experience, of childbirth, formulas so we don't have to breastfeed our children, and now pills so we don't have to menstruate? Something isn't right here.
I recently had a conversation with a friend of mine who had a baby a few months ago and, while on vacation in San Francisco of all places, was scolded for breastfeeding her child in public. She was completely covered and, knowing her, was very tasteful about the whole process, but a woman decided it was rude of my friend to do something like that in front of her boyfriend and clearly stated her opinion. Is our culture so preoccupied with the sexual nature of breasts that we've completely forgotten their purpose: to give sustenance to our children?
I realize that, as women, we have long been fighting for equality, and I certainly do not take that for granted. But why must this entail losing everything that makes us women? We have spent decades suppressing our femininity, which in my opinion holds a great deal of power and wisdom. We cycle in sync with the moon, we bring human life into this world, our bodies producing the milk that sustains and nurtures that life... I am woman, hear me roar :)