They allowed for much stronger and more durable weapons.
Answer:
Because of the political climate in most Western nations after WWI, Germany had lost much of its national pride (nationalism), and Hitler promised to restore Germany to its former glory.
More control over family businesses: The Civil War took men away from farms, businesses, and plantations, and women were expected to take charge in their absence.
The war opened new economic opportunities, giving women the chance to run family businesses. This was especially true for women in the South, who often had to manage plantations while their husbands fought or served in the government.