Because they are important to the American people and symbolize America as a whole
It’s B.
Strengthened the legal protections for slavery
Answer:
I think nursing came to be a profession dominated by women for several reasons. First, during wars, men went off to fight. Who was left to care for the wounded soldiers, who were male? Women. Second, I think women naturally seem to have more patience for the work of nursing. Many are mothers themselves and seem to best understand how to care for others.
Explanation:
Hope this helps.
The country that lost all of its African colonies by 1920 was Germany.