World War I had a tremendous impact on women. They assumed many of the jobs left behind by men who were drafted to fight in the war, working in all industries, from farms to factories. Despite their prevalence in industry, they did not earn wages comparable to men's. After the war ended, many women returned to the domestic roles they had occupied before the war. Nevertheless, the war fostered a new sense of independence and responsibility among women.