American nationalism is a form of nationalism found in the United States which asserts that Americans are a nation and which promotes the cultural unity of Americans.[3]
American scholars such as Hans Kohn have claimed that the United States government institutionalized a civic nationalism based on legal and rational concepts of citizenship and on a common language and cultural traditions, rather than on ethnic nationalism.[3] The founders of the United States established the country upon classical liberal individualist principles rather than ethnic nationalist principles.[3] American nationalism since World War I, and particularly since the 1960s, has largely been based upon the civic nationalist culture of the country's founders.[4]