The country that did not control any territory in Africa was the United States (option C). This is further explained below.
<h3>What is the United States?</h3>
Generally speaking, the United States of America, commonly known as the United States, is a nation in North America.
In conclusion, the United States, unlike the major European powers, was the only country among the options that did not have territorial holdings in Africa.
No.
The Treaty of Versailles (1919) was the peace treaty between Germany and the Allied Powers that officially ended World War I.
After six months of negotiations in Paris, the treaty was signed as a follow-up to the armistice of November 1918 at Compiègne, which had ended the fighting. Its central provision required Germany to accept responsibility for causing the war and, under articles 231-247, to pay reparations to a number of the Entente nations.
Although the Treaty of Versailles formally settled the war, it was largely a way of blaming and punishing Germany, and I don't think it can be said to have brought peace when World War II broke out only years later. The powers should have negotiated an agreement among all countries rather than placing the burden on Germany alone.
Getting a job right after high school sounds like a good idea; however, you are limited to certain low- to medium-paying jobs, most of which involve manual labor.
hope this helps
Answer: C. The loss of troops convinced Lee and the Confederacy never again to invade the North.
Explanation: I had this question and got it right