What in the world is going on?
I don't mean to be a downer, but I just have to get my thoughts and feelings out there about this particular issue.
How is it that I still live in a world in which women are treated like sexual objects, inferior to men, or even sometimes as mere slaves, existing only to please or serve men? As an American woman, I have long believed that I am supposed to be treated as the equal of any human being on the face of this planet. But then things pop up in the news and I am reminded that maybe I am only chasing a pipe dream. And truly it leaves me sad and frustrated. Not just for me, but mainly for the women of the future. Where are we headed as a society if we don't value women and everything they have to offer?
I have a husband who treats me as the valuable and precious being that I am. And I want to believe that every woman deserves that, every day. Not just the lucky few who find the real men in this world...
But the thing that really gets to me is this: Why are there so many women who are willing participants in this idea that women should accept any kind of cruddy treatment?! Every time you hear a story about women being degraded by men, there is almost always a woman right there in the middle of it, participating just as much as any man in the situation.
Again I say, what in the world is going on here?