Answer:
C is the right answer, I think.
Politics of the Southern United States (or Southern politics) refers to the political landscape of the Southern United States. Due to the region's unique cultural and historic heritage, the American South has been prominently involved in numerous political issues faced by the United States as a whole, including states' rights, slavery, Reconstruction, and the Civil Rights Movement. The region was a "Solid South," voting heavily for Democratic candidates for president and for state and local offices from the 1870s to the 1960s. Its Congressmen gained seniority and controlled many committees. In presidential politics, the South moved into the Republican camp in 1968 and has remained there since, with exceptions when the Democrats nominated a Southerner. Since the 1990s, control of state and much local politics has turned Republican in every Southern state.
Hello!
I believe one effect of the Great Depression on women was to strengthen the belief that a woman's place was in the home.
The new strategy was reinforced once Ulysses S. Grant took command of the Union Army; he adopted a more aggressive approach and had the army press the attack.