I know I don't have a huge platform at all, but please spread this. Women in America are having their rights stripped away. If they get raped, they are forced to give birth to the child. Some of the people being raped are literal children. They can't take care of a child. They have a whole life ahead of them. They need an education. Not only that, they could die during childbirth. It's ridiculous that men think they have the right to speak about women's bodies. And it's not only men; there are WOMEN fighting to take WOMEN'S rights away. I don't care what you have to say: women don't get a say in what other women do with their bodies. I know my platform is small, but hell, I'm going to speak up.