Although the war began with Nazi Germany's attack on Poland in September 1939, the United States did not enter the war until after the Japanese bombed the American fleet in Pearl Harbor, Hawaii, on December 7, 1941.
The American Revolution was fought mainly between the United States and Great Britain, until the French joined. The French Revolution began as a conflict between the lower-class French and the French government, and then evolved into France fighting other European monarchies such as Austria and Prussia. I hope this helps.
The United States became an empire in 1945. It is true that in the Spanish-American War the United States took control of the Philippines and Cuba, and that it began thinking of itself as an empire, but it was not truly one yet.
It gives the weakest support to the government, to the government's authority, and to its people.