American nationalism is a form of nationalism found in the United States, asserting that Americans are a nation and promoting the cultural unity of Americans.[3]
American scholars such as Hans Kohn have claimed that the United States government institutionalized a civic nationalism founded upon legal and rational concepts of citizenship and a common language and cultural traditions, rather than ethnic nationalism.[3] The founders of the United States established the country upon classical liberal individualist principles rather than ethnic nationalist principles.[3] American nationalism since World War I, and particularly since the 1960s, has largely been based upon the civic nationalist culture of the country's founders.[4]
Answer:
I'm not sure exactly what you are looking for here, but in the 1800s many cities came to be made up of immigrants who fled to America for a better life. They changed the demographics of these Northern cities and played a great part in making America what it is today.