How have changes in women's rights affected American society? Consider the family structure, economic health, and the strength of the
work force. Use specific moments in history as examples. In your opinion, has greater gender equity improved American society? Consider how the changing role of women has reshaped the identity of American society in Americans' eyes and the world's. (Answer must be at least a paragraph.)
In the early 1900s, women had almost no rights at all, from being unable to vote to being discouraged from holding jobs outside the home alongside their husbands. That changed on August 18, 1920, when the Nineteenth Amendment was ratified, granting women the right to vote. The country changed that day, and women everywhere were overjoyed. Many men, however, opposed the amendment; they saw themselves as the dominant gender and believed women should not have those rights. Today women make up about fifty percent of voters, which in the U.S. amounts to roughly 162 million women.
From my knowledge, the U.S. government believed Japanese Americans on the West Coast could give Japan a strategic advantage, since the West Coast was the part of the country closest to Japan. Officials feared they might act as spies or look for ways to help the country the government assumed still held their allegiance.
This was because the Confederates were trying to push toward Washington, and the Union managed to halt the advance; had it failed, Washington, D.C. would most likely have fallen into Confederate hands. The battle also earned the distinction of being one of the bloodiest single-day battles of the Civil War.