The colonies were founded by the United Kingdom, so most of the colonists had a positive attitude toward it. Many of them had come from Britain themselves, and many still had family living there.
The Roosevelt family should be the answer.
Answer:
b
Explanation:
Germany did not increase its worldwide dominance; if anything, it lost a great deal of territory after the First World War.