I have recently read a number of books that really shatter some socially accepted myths about us and our bodies - that we are designed to give birth (we aren't; we're designed to live, just as much as men are. We're just capable of giving birth. No one says men are designed to ejaculate!) and that we are inferior to men. There is a lot of wisdom in our bodies, and I've loved reading these books. Has anyone read any others?
Give Birth Like a Feminist
Pushed: The Painful Truth Behind Childbirth
Women's History of the World
The Better Half (on why genetically female humans are better at surviving birth and staying healthy!)
Woman: An Intimate Geography

Can't edit this, but another point: our uteruses don't exist to make babies. They exist to keep us healthy and are an important part of our bodies and hormonal cycles regardless of pregnancy!