Once upon a time, the Deep South of the United States voted Democratic in every election. Now it seems to have switched to predominantly Republican leanings. Is that because Republican politicians started to appeal to racist white voters in the South? Anyone who knows their history will tell you the Democratic Party was the party of slavery and Jim Crow, and the Republican Party was the party of emancipation and racial integration. Southern Democrats were the segregationists, while the Republicans were the tolerant northerners. Yet in the '60s and '70s, everything seemed to flip. All of a sudden, the Republicans became the party accused of being racist and bigoted. What's the truth? Why the change? The answers are found in this short educational video from Prager U.
Unless you've been living under a (safe, cozy, comfortable) rock, you've probably noticed that the left seems to have gotten a little... what's the word... crazy lately.