American media played a central role in the emergence of the United States as a world power by transmitting American culture abroad, creating a form of cultural colonialism and expansionism that reshaped the American image around the world.
An example of this is the vast American film industry, which reaches audiences worldwide and presents an American perspective on the subjects and themes it portrays.