Many people believe the United States should have helped Germany after World War One. After WWI, the Treaty of Versailles, with its aggressive reparations, pushed Germany into a severe economic depression. Many historians believe that the way WWI ended helped cause World War II.
The answer is Confederacy.
By means of excavations and the study of the areas in which they lived, as well as careful examination of their remains to assess their health and whether there were identifiable diseases they suffered from, among other things of this nature.
So that the gods didn't get mad at them.
1. That's what I got on my test.