Many people believe the United States should have helped Germany after World War One. After WW1, the Treaty of Versailles imposed aggressive reparations that pushed Germany into a severe economic depression. Many historians believe that the way WW1 ended set the stage for World War 2.
Answer: No. There was no need to declare war, as the power to do so had not yet been established.
Explanation: