Answer:
The 19th-century belief that Americans were destined to expand across the continent and build a nation from coast to coast.
Explanation:
Manifest Destiny was the idea that Americans had a God-given right to expand westward and build a country for their people.
I learned about this last month in U.S. History.