Answer:
Gradually, American society came to accept that girls could be educated and that women could be **teachers**.
Explanation:
In the 19th century, most American men and women lived by traditional roles in society. Men went out to work and served as the breadwinners, while women were expected to be 'homemakers' who took care of the children, cleaned the house, cooked, and did the dishes.
Within these traditional roles, it was not considered normal for a woman to be highly educated or to work outside the home.
However, by the early 20th century, things had started to change gradually. With the onset of the First and Second World Wars, more and more women left their homes to take up jobs.
In the early years of the women's rights movement, though, conservative Americans found it hard to accept that girls could be educated and that women could become professional teachers.
Arkansas, Colorado, Louisiana, Montana, New Mexico, North Dakota, Oklahoma, South Dakota, Texas, Utah, and Wyoming.
Hope that helps