Answer:
The United States changed in many ways after World War II: it developed a greater appetite for consumer goods, came to see itself (and be seen by others) as a superpower, and became more deeply involved in European affairs.