Imperialism occurs when a mother country takes over a smaller nation or colony for political, social, and/or economic reasons. Imperialism has been a major force in shaping the modern world, and its effects have been interpreted from a variety of viewpoints. The major era of imperialism occurred during the late 19th and early 20th centuries, and it has had more negative effects on the modern world than positive ones.
"The main way in which the United States developed politically during this time was that the Republican Party formed as an anti-slavery platform, while economically the nation flourished due to the create of railroads."
Original answer from: HIstoryGuy