There's something that I need to say, and it's been a problem for much too long.
I look at the world around me and I don't like what I see. I am so entirely sick and tired of the way that women are treated in this world, and especially in America. This country that I live in is supposed to be one of freedom and hope, right? People come here to escape the horrible lives that they have had to live. They come here to start over and find something better. But is it really any better? Or are we still fighting the same battles, dealing with the exact same things, just on somewhat different levels?
I see people fighting for every right you can think of, picketing the streets, and saying "Enough is enough!" about every type of issue. Yet I don't see nearly enough being done to stop the rape and mistreatment of women and children. Rape is still the topic that no one wants to talk about. If mentioned, it can silence an entire room, when really the mention of that sort of brutality should cause an uproar to fill a room. We should be getting angry and saddened for all who were hurt in this most unforgivable way. And that raw emotion should cause more people to stand up and do something about it. How can you forget that these hurt souls are your grandmothers, mothers, sisters, cousins, and friends? By being silent, we are condoning the actions of the men who felt that they had the power to try to ruin their lives.
I hear men and women make jokes about sexual assault. I hear songs on the radio that say women don't necessarily need to give consent because there are "blurred lines," and that men know what we really want despite what we might say. I hear victim blaming everywhere I go. I myself have been told to "Just get over it," "You have to just move on," or "Well, you should have told someone then." So you're insinuating that this is my fault? That a grown man, a father, held no responsibility for raping a child, but that I, just a four-year-old little girl, was somehow responsible not only for making sure it didn't happen but also that it didn't continue? But when you're a small child and you are hurting, who do you turn to? Who is supposed to make it better? Who is supposed to protect you and get the bad guy? What if Daddy is the bad guy? Then who can you trust? Who will protect you? In that case, who is it that you're supposed to run to?
Through everything I have gone through, somehow I still maintain this little bit of faith in humanity, and despite the horrific actions of some I try to hold on to that. However, every day that I live and breathe, that hope diminishes a little bit more and this world gets a little bit darker. I want to believe in the good in people, and I want to believe that not every man is evil. But what I see with my eyes and hear with my ears tells a different story.

Instead of teaching young men how to be decent human beings, to treat women with respect, and that they have a responsibility to protect women, boys are taught how to have sex. By high school they know how to put on a condom and they know about STDs, but why don't they know what constitutes consent? If we teach them and show them that women are just meant to fulfill sexual desires, and then we teach them how to have sex, then how the fuck are they supposed to learn how to be decent men, value everyone, and treat women with kindness? All this talk nowadays is wrong. We shouldn't be teaching men not to rape or women not to get raped. We should be teaching all children about the value of another person, and that no one has the right to physically harm you, force themselves on you, or guilt you into having sex with them.

Telling girls that they should wear some fucking anti-rape clothing is saying that we actually have the power to control whether or not a man rapes us. If we had that power, we wouldn't have been raped in the first place. We would have told them no and walked away. But that choice was taken away from us. Blaming the victims is basically saying that men are not even human. That they function purely off animal instinct and have no control over the use of their own penises. That a girl could walk by, and without any thought whatsoever he would be on top of her with absolutely no fucking idea how he even got there. So, men, are you animals? Do you not have the ability to make decisions of your own? Because if you disagree with this logic, then victim blaming should piss you off as much as it pisses us off. It should cause you to stand up and fight for the rights of women. To show the world that you are more than just a walking sperm bank.
Some days I picture the life that every girl does: husband, kids, a nice house, a dog... Then reality sets in and I realize that the reason I haven't found that is that there are so very few good and decent men in this world. I'm starting to be okay with the fact that I may just be alone, I may never find him, and there are other ways to become a mother. Because I would rather be alone forever and be that weird old lady on the block who lives by herself with a bunch of cats than be the woman who married for the sake of not being alone and got trapped with someone who can't even respect her. I deserve better, and if I can't find a man who can give that to me, then I can be alone. A lot of women measure their worth in their men, their kids, their houses, and that perfect life that we all dreamt about as children. But we are more than that. And until we realize that ourselves, how can we expect anyone else to? I will not wait for a man to tell me my worth or what I deserve, no matter how amazing he is. I will know it myself.