I don't mean this in an offensive or insulting way, but I'm curious, on the off chance there are some Americans here, about their opinion. I still see that patriotism has huge power in America; even the most intelligent, progressive, liberal people say they love America. I get that it's something ingrained since childhood with the Pledge of Allegiance, and I understand the reasoning behind that.
I just look at America now and find it hard to see anything to love, except some of the wonderful people I know there. I know a lot of people around the world are patriotic to the point of finding it hard to criticise their country for doing wrong, but in America it has always gone a bit past that. Personally, I just don't think I would be proud of being American right now. I feel like the main things people say about being American are starting to apply less and less, e.g. "land of the free".
Not looking for a debate; I would just be interested in hearing an American perspective on their country (if there are Americans on here), because if I asked any of my American friends or family they would probably take it the wrong way.