I don't like being this way, but I have to say that, in my opinion, life is all about being born, taking care of the earth, and then using your dead body to fertilize its soil, and then your child or children do the same. I wonder if it's worth it. It seems bleak, but so are most things in life.

When I was little, my mother told me that when she was pregnant with me, God spoke to her and told her that I was put on the earth as a friend for my sister. This pissed me off. There has to be more to life than just being the friend of someone who doesn't like that. Later on in my life, I found out that my mother had a vision from God when I was a baby; she said I was gonna be the high priestess of the church one day. Needless to say, I laughed. The idea that someone like me would be in a high position in a church is insane. There has to be more to life than doing everything someone in the sky tells you (no offense to anyone who believes in God, this is just my own personal OPINION). There has to be something that makes life worth living, something that makes life less empty.

I have been searching for this for a while and eventually found something that makes life worth living, something absolutely amazing, something really cheesy: I found love. I know it seems lame and cheesy, but you have to have fallen in love to understand it. I fell in love with someone, and they reciprocated those feelings, and I had to go fuck it up. I can't deal with myself anymore, and I want to believe that it gets better, but it doesn't. I don't see myself getting better. What a cliché, to die because of a boyfriend/girlfriend etc. I don't want to kill myself, I just want to die, if that makes sense. I don't know what to do with myself anymore.
P.S.: If anyone actually reads this, please don't comment. I'm just trying to vent, I think.