The British were arming them, and the Americans were stealing their land.
The First World War ended with the Treaty of Versailles, which undid everything Germany had gained from the war. Under the treaty, Germany could not keep an army of any significant size or power, lost large portions of its territory, was forced to pay for the damages caused, and had to admit that the war itself was Germany's fault. Every part of this treaty angered Germans. So when the Nazis (who were fascist) became popular and appeared able to abrogate the treaty and restore Germany to what they believed was its rightful place as a major power, the Germans supported them.
Italy and Germany became nation states because of nationalism among their people. It was not due to World War 2, since both unifications took place in the 19th century, while World War 2 occurred in the 20th century.