I was raised in the South and taught to be proud of it and its history. All I really see are massive amounts of industrial plants, high cancer rates, the lowest quality of life in the United States, corruption you would not believe, and the religious fanatics who defend it all like it's a duty. Don't get me started on our history. There are people who are proud of it all because when they look at the same industrial plants, instead of seeing the negative characteristics, they see wealth, jobs, and progress. Life is very much about how we view the world, and it takes a lot of self-reflection to really challenge the core principles you were raised with.
People love to say that soldiers die defending freedom, but were they really defending the way of life of people like this?
One of the biggest misconceptions is that the United States fights humanitarian wars. We don't. We love to say that we do, but we don't; no nation does. Nations fight wars out of self-interest, and then attach selfless reasons like humanitarianism to make them more tolerable... WW2 wasn't fought to save Jews, it was fought on economic grounds. If Japan had won WW2, they'd have made a big deal about the internment camps and said they were fighting out of a sense of duty to protect Japanese people around the globe.
The only way to really avoid conflict in an era in which one war could mean not only the end of our way of life but the end of life itself is to challenge the notion of these black-and-white wars, and take a step back to try to understand why both nations are willing to send so many people off to die.