England’s colonization of the Atlantic coast in the 17th century laid the foundation for the United States of America. The centuries following the European arrivals would see the culmination of this effort, as Americans pushed westward across the continent, enticed by the lure of riches, open land, and a desire to fulfill the nation’s manifest destiny.