Once upon a time, the Deep South of the United States voted Democratic in every election. Now it has switched to predominantly Republican leanings. Is it because Republican politicians started appealing to Southern whites considered racist? Anyone who knows their history will tell you that the Democratic Party was the party of slavery and Jim Crow, while the Republican Party was the party of emancipation and racial integration. Southern Democrats were the segregationists; Republicans were the tolerant Northerners. In the 1960s and '70s, however, everything seemed to flip: suddenly the Republicans became the party accused of being racist and bigoted. What's the truth? Why the change? The answers are found in this short educational video from Prager U.
EXCLUSIVE: General Mattis Planned Primary Run Against Trump, Pence Was Also Considered, Nikki Haley Was Tested As Running Mate
May 22nd, 2019 Big League Politics
Wait, you mean I don't have to get a sociopolitical lecture along with my caffeine-and-sugar dose? There's another way?!