American nationalism is a form of nationalism found in the United States which asserts that Americans are a nation and which promotes the cultural unity of Americans.[3]
American scholars such as Hans Kohn have claimed that the United States government institutionalized a civic nationalism based on legal and rational concepts of citizenship and on a common language and cultural traditions, rather than ethnic nationalism.[3] The founders of the United States established the country upon classical liberal individualist principles rather than ethnic nationalist principles.[3] American nationalism since World War I, and particularly since the 1960s, has largely been based upon the civic nationalist culture of the country's founders.[4]