The Democratic Left Embraces Racism

Have you heard that America is racist? “Systemically racist”? How many times this year, this month, this week, or today? Leading Democrats espouse that claim, and the media, academia, and Hollywood all nod along in agreement. The Left wants you to believe it in order to make the case for everything from reparations to throwing away the Constitution. Yet…