World War I significantly shaped American opinion about many aspects of the future. The U.S., for instance, began to isolate itself from the rest of the world and became increasingly unwilling to intervene in external disputes. At the same time, the war accelerated economic growth and changed attitudes on issues such as the rights of women.