Before WWII, most women stayed at home as mothers, cooking and caring for their children. During the war, women began working in factories to fill the jobs left by men who had been sent off to fight. After the war, a large percentage of women continued working outside the home rather than returning to domestic life.
This depends on your definition of "friendly". Although there was plenty of animosity in the years after the war, the Americans and the British maintained a fairly cordial relationship because it was in both countries' interest to trade with one another. However, tensions over land disputes and British impressment of American sailors eventually led to the War of 1812.
It wasn't recognized under white law.
June 25th, 1950, I believe! :P