Oh for sure, it's not morally wrong to think other countries are better than the United States. I honestly think people who do believe the United States is the best either have a very idealized version of it or have never lived outside the United States for an extended period of time. I'd normally say traveling helps, but for many Americans, vacations are very resort-y and insular rather than eye-opening. Honestly, if I had my way, I'd have all Americans live abroad, either through university or through public service programs like the Peace Corps.
But also keep in mind that just because the United States has a lot of flaws doesn't mean other countries don't. Xenophobia, homophobia, sexism, crime, and corruption all happen to some degree in other countries, some less than in the United States, some more. As a woman, for example, there are a lot of countries I'd be scared to visit alone. On the other hand, there are also plenty of places in the United States I'd be scared to go by myself. Personally, I'd like to try to make things better here in the United States. I don't have the privilege of living elsewhere, and the same is true for millions of other Americans. There's a lot in the United States I don't like, but there's also a lot I do. Our National Park Service, our wildlife, our diverse foods, our cultures, our welcoming of immigrants...those are the things I keep in mind whenever I feel down about the politics here.