This statement is true. Several major events show that America was far from isolationist during this roughly 160-year period. Here are some examples.
1) Manifest Destiny- During this era in American history (early to mid 19th century), the US gained hundreds of thousands of square miles of territory from countries like France (through the Louisiana Purchase) and Mexico (through the Mexican-American War and the Mexican Cession).
2) Imperialism- The era of American imperialism (late 19th and early 20th century) saw the US expand its territory, annexing places like Hawaii and Puerto Rico.
3) World War I (1917-1918)- The US entered World War I after Germany took several actions that angered the American government and public, including the sinking of the Lusitania and the Zimmermann Telegram.
They took the idea of independence from Britain and extended it into an idea of independence for enslaved people from their owners.
Hope this helps!
The greatest social shifts in the United States during this time were caused by the Second Great Awakening, a religious revival that in part sought to curb immigration. This led to calls for prohibition, because it was widely believed that immigrants were the heaviest drinkers.
Before WW2, most people in the workforce were men, with women doing the housework and being told not to take outside jobs. During WW2, however, many men were called to serve in the military, which left a large number of positions that needed to be filled. Women stepped into these positions and took over producing war materiel. After WW2, some women returned to working at home, but many still wanted to work outside it. This shifted the workforce from being predominantly male to something closer to half-and-half.
Hope this helps!