This is a topic that's been bothering me lately.
People are always telling you to be honest. Your parents, teachers, other people, you get the idea.
They say, "Be honest! Tell the truth."
Well, whenever they ask me a question (such as "how do you feel about this, that, etc.?"), I answer with my honest opinion.
Example:
Yesterday my family all went over to my Grandma's house to watch a football game. I can't stand football so I wasn't super excited about going.
So while we were all sitting around with the game on, I was ignoring it while munching on some chips.
From where I was sitting I couldn't see the TV, which was fine by me. My grandma suggested that my brother and I move the couch so we could see, and I said,
"Oh I don't care if I can't see. I don't even like football. I only came here for the food."
That got some laughs and without really thinking I added, "Actually I came because I was forced."
Now my grandma laughed but my mom was giving me The Look.
So I went, "What? It's the truth!"
Then I went on to rant out loud about how I'm always being told to be honest, but when I am, everyone gets mad.
Fortunately for me everyone thought it was pretty funny so *phew*
But seriously! I am told to be honest, and when I am, everyone gets offended.
Does someone have the answer to this?
And yes, I understand the difference between being honest and being rude.
I'll admit, there are times when I'm rude, but most of the time I'm just being honest.
So yeah, random rant is over.