Answer:
Gradually, American society came to accept that girls could be educated and that women could be teachers.
Explanation:
As late as the 19th century, most men and women lived according to traditional roles in society. Men worked and served as the breadwinners, while women were the 'homemakers' who took care of the children, cleaned the house, cooked, and did the dishes.
Within these traditional roles, it was not considered normal for a woman to be highly educated or to work outside the home.
However, by the early 20th century, things had begun to change gradually. With the onset of the First and Second World Wars, more and more women left their homes to take up jobs.
In the early years of women's rights, though, conservative Americans found it hard to accept that girls could be educated and that women could become professional teachers.
It really depends on what you believe in, but scientifically the Earth formed after the Big Bang. Back then the Earth's crust was extremely hot, but over time it cooled to form the hard crust we have now. Animals are believed to have developed from earlier organisms evolving over time.
The answer is A: to create more educated voters.
The Cold War, a "war" of attrition and nuclear build-up between the United States and the Soviet Union, greatly shaped American culture through the public's emphasis on space and science as a way to beat the Soviets.