1. They became better educated.
2. They took jobs writing books.
3. They learned about freedom.
They supported liberal reforms in the lands they conquered.
-The increase in immigration
-The aftermath of WWI
I'm not sure whether this is correct, but it's what I could recall about the period before 1920.
Britain had fought in the French and Indian War.
Florida had become a burden to Spain, which could not afford to send settlers or garrisons there. The Spanish government therefore decided to cede the territory to the United States in exchange for settling the boundary dispute along the Sabine River in Spanish Texas. I hope this is helpful; have a nice day.