Answer:
The United States accepted its role as a world power after World War II, helping to rebuild Europe and Japan and taking the leading role in establishing the United Nations.
Economic and political factors influenced the U.S. to practice imperialism, particularly during the late 19th and early 20th centuries.