The United States. The main cause of this war was the westward expansion of the United States, driven by the American ideal of "Manifest Destiny" — the belief that it was their God-given right to expand westward. In the period leading up to the Mexican-American War, many Americans believed they could conquer the people already living on the land and take it for the United States.
One of many results of Japan's invasion of Korea in the 1500s was a "gradual opening of Japanese culture to other influences," since the invasion ended a period of relative Japanese isolation.
The United States tried very hard not to be involved in the war. At first, all it did was help its allies with food and other supplies. It did not actively join the war until December 7, 1941, when Japan bombed Pearl Harbor. The United States then officially declared war on Japan, and soon after on Germany.
The 9th Amendment is the answer
Sorry, I am not sure, but I think Germany.