So in school we're told that we aren't supposed to be taught about the Bible because it goes against our freedom of religion. But is that really true, or are we taught the Bible anyway? Most of what we learn, read, or hear in social studies is about religion. They can say what they want, but it's true. Seriously, sit down in a 7th grade social studies class and they're mainly learning about multiple gods and goddesses, but before and after that year you aren't taught much else about it. If you want equality, then you should be more open about religion, and not have every song about America mention God.

This really drives me insane, because I question myself almost all the time. Do I believe in God? No. I believe there are gods and goddesses, though I don't truly believe in them either. But from a young age it's drilled into our brains to believe in God. We are taught that anything else is wrong and that we will go to hell. To be fair, I believe Earth is more of a hell than hell ever would be.