The main event that led the United States to take a more active role in world affairs was World War II, since the US emerged from the war as an economic superpower. It was also clear that some nation needed to "fill the void" left by the relative decline of the British Empire.