Can somebody who does not live in the US tell us what they think about the United States of America? I feel like in movies the United States is portrayed as guns and burgers. Is that how you guys see us, or are there other stereotypes that people have about Americans?
Martha, are you saying that America is not guns and burgers? Wow, it absolutely is guns and burgers. There's no way it's not. That's not a stereotype, that's the truth.
My take on America is that it's like a pimp-ho relationship. Basically, America is the pimp and everybody else is the hoes, and if you're a Black person in America, you walk around frustrated because you see white people living the life you should have lived, because you never got paid for those 400 years of slavery.
Also, I think there's a bittersweetness about being in America. Like, you love America because, you know, it's a melting pot and every race pretty much built it. But then you hate America because only one group benefits. And I love white people, but this shit fucking sucks.
From my time and understanding, America is the freest country in the world. No other country compares to America in the sense that we are allowed to do, be, say, and act however we want. How is that not beautiful?