
After the Civil War almost all blacks became Republicans—the party of Lincoln. So what happened along the way?
Why are they now almost all Democrats?
And why did Democrats go from demonizing blacks to loving them?
Why did the Democrats go from the party of racism to the party of civil rights?
How can Democrats deny their racism? That they brought in the Ku Klux Klan? That it was the Dems who were the slave owners and the big bosses of the slave plantations?
I suppose most Democrats will say that they have changed. Really?
What Happened?
Okay, let's talk about what really happened. After the Civil War most blacks became Republicans, but they were demonized and abused for it by the Democrat party. And Woodrow Wilson really put a scare into them through the Ku Klux Klan, frightening some of them into voting Democrat.
Then under FDR…