Is it me, or does it seem like EVERY fuckin body in America has an opinion about EVERY fuckin topic? Especially concerning politics. Since when did everyone in the country become a scholar on world politics and economics? Why does everyone think their opinion reigns supreme?
I, for one, do not care about your opinion, your mom's opinion, or your friends' opinions. I don't care about anyone's opinion anymore. 'Cuz everyone's an expert on everything apparently, so what's the point of even debating anymore?
This isn't a Siccness observation, by the way. It's something I'm noticing EVERYWHERE.
By the way, does anyone else think political opinions should stay out of the workplace? Do I really need to hear how the CIT lady at my work feels about universal health care?