Manifest Destiny was the idea that Americans were destined, by God, to govern the North American continent. This idea, with all the transformations of landscape, culture, and religious belief it implied, had deep roots in American culture.