Alright, guys. Time for me to get my thoughts out.
Women are treated better than men. Very controversial, I am aware, but let me speak. Most women take feminism as a license to degrade men, and much of non-male society is okay with that. Yet men can't degrade a woman, because that's sexist and not okay. The only difference between the two is gender.

I saw a Reddit post from a man who had been raped. He described how he had searched for help and couldn't find any; he was essentially told to 'man up' and then left alone. Every avenue of help he found offered that help to women. Men can be raped too.

I saw a comedy skit pointing out that a woman could walk up to a man and his son and say, 'Oh, he's so cute. I'm coming back for him when he's 18, save him for me,' and then walk away. The skit continues: the comedian dares any man in the room to say that to a woman and her daughter; that man would be escorted out.

Women want to be treated equally? Well then, they're going to like it less. It's just like how a blind eye is turned when a black person is racist towards a white person, because yes, that's possible, but when a white person is racist towards a black person, that's not okay. White men, such as myself, are treated worse in today's society, and I'm sick and tired of people pretending they aren't.
That's my TED Talk. Thank you.