In our male-dominated society, almost everything we do is defined by our relationships with men, even our identities. Women are taught from a very early age that the idea of a relationship is a love connection between a man and a woman, and that our job is to make men happy. We're supposed to be soft be