America is a racist dystopia. I'm not talking about everyone, but most of the Americans I encounter are racist in some way, and they deny being racist when they blatantly are. For example: "I didn't know that word was racist" or "everyone at school says it." That doesn't make it okay for them to open their big mouths. I know every country has racists, but I think America is the worst. Imagine being called ghetto just because you're Black.