I do care about the wrongdoings of any country. The issues with police violence, racial discrimination, and gun violence in the US are no secret, and you are not the first to bring them up. Everyone in the West talks about them; by now they're the first things most people there think of when the US comes up.
That being said, I don’t understand why, whenever anyone so much as says “China bad”, the answer is always “but America is also bad”. Why does that make it right?