Women gained several important rights in America, including the right to vote, sexual rights, the right to hold public office, the right to choose, equal pay, and the ability to enter into legal contracts, among others.
Women and women's organizations fought on a large scale for social change, political and economic equality, and the right to vote in the late 1800s and early 1900s. Women's employment in the United States increased from 2.6 million to 7.8 million between 1880 and 1910.
Women now have rights such as freedom from abuse and discrimination, the highest attainable standard of physical and mental health, education, property ownership, the right to vote, and equal pay. However, many women and girls around the world still experience sexism and gender discrimination.
To learn more about women's rights, see:
brainly.com/question/13662563
Answer:
Because of the idea, painted in most people's minds, that black men are always affiliated with gangs and other criminal activity, they are assumed to be dangerous and treated as a problem.
Answer:
A nativist would define a "real" American as a native-born, white citizen.