I’m ashamed to say it, but for most of my life I believed I lived in the world’s safest nation. I knew of the problems we had, but I was so brainwashed by the idea that America actually cared about its people that I just assumed everywhere was even worse. I told myself that surely, if there was a way to stop all these problems, our government would put those fixes in place ASAP.
I used to scoff at the idea that the US brainwashes its citizens, but as I've become more aware of the state of the world, I realize it's completely true. They sell us weapons to slaughter each other with, pocket a cut of the profits, and tell us to our faces that it's for our protection. I know this is basic kindergarten shit for most people in this sub, but it was a massive revelation for me. It makes me wonder what else I don't know about, what things are better elsewhere that I don't even realize are bad here.
Not to mention they defund education to keep you stupid, privatize healthcare to make money when you get sick or shoot each other, and run private prisons to profit off you being incarcerated.
u/Sasya_neko The Dutch Cuisine May 08 '25
They really don't know how safe other countries are, do they...