How did Manifest Destiny fundamentally change America?
Manifest Destiny, a phrase coined in 1845, expressed the philosophy that drove 19th-century U.S. territorial expansion: the belief that the United States was destined, by God in the eyes of its advocates, to expand its dominion and spread democracy and capitalism across the entire North American continent.