America Is NOT Fundamentally Racist

I cannot stand how often it is claimed that America is a fundamentally racist country. The claim is absurd, and those spouting it are either ignorant or willfully deceptive, often with political motives.

The vast majority of Americans are NOT racist, despite the narratives that demonize them as such.

Furthermore, defending oneself against accusations of racism is not itself racist. When someone is smeared as a racist, it is entirely reasonable for that person to defend themselves.