By settling newly discovered lands
Industrialization changed American culture drastically. It was also closely tied to westward expansion, which in many ways led to the conflicts that brought about the Civil War.
The Indian Removal Act allowed the government to control most aspects of American Indians' lives, including their land and resources.
Racism: discrimination based on a belief that some "races" are innately superior to others.