The role of women (specifically white women) changed dramatically during this time in the United States because production moved largely out of the house and into factories, meaning women stopped being producers and became consumers. Socially, this meant that women gained slightly more independence.
When soldiers went away to war, their jobs needed to be filled to keep the economy stable, and women took this opportunity to work in munitions factories, sewing, and many other trades. This was a big step toward gender equality. After WW1, attitudes began to change as people realised that women could work just as well as men. In some countries women were given the vote, more opportunities opened up in different industries, and women were able to make a living for themselves instead of relying on a husband to support the whole family.