Democratic dominance of the South originated in the struggle of white Southerners during and after Reconstruction (1865–1877) to reestablish white supremacy and disenfranchise blacks. The U.S. government under the Republican Party had defeated the Confederacy, abolished slavery, and enfranchised blacks.