John Colet lectured on the Epistles of Paul :)
Winning World War I gave a huge boost to American pride, but America still wanted to focus on itself. Most Americans felt that Europe could rebuild itself (which was horribly wrong and set things up for the rise of Hitler and the start of World War II). The American people also felt that Americans should not have to fight and die in foreign wars, and that since they were across the Atlantic they didn't need to worry about European affairs.
The answer would be B: English settlement of the world