I'm really done with America. We have to put up with so much sh*t from them: homophobia, gun violence, racism, Trump... can we just put them in the naughty corner and ignore them until they learn to be better? I feel like every day I see something in the news from America that makes my heart sink. I feel sorry for the poor Americans who hate what they see going on around them but have to live with it.