This question asks for your own opinion, so I can't give you a direct answer, but here are some ideas.
Yes: The war was caused by extreme nationalism; nations wanted to be strong and sought national glory. Starting from that idea, they began to form alliances and expand, soon running into conflicts, until the scale eventually grew into a world war. The pursuit of national glory was originally meant to build a better nation, but in the end it only caused loss and destruction. War does not have to be started in order to gain national glory; had the nations backed down, it could have been avoided. If you choose yes, you might want to expand more on the destruction the war caused.
No: Although WWI did cause a lot of destruction, it pushed human history forward. You could talk more about the things achieved and changed by WWI, such as the fall of empires, the rise of women's rights, the establishment of a system of international cooperation (the League of Nations), and the development of more destructive weapons.
Hope it helps!
The Harlem Renaissance was significant because it gave African Americans and other people of African descent a new and improved status in society, as they made wonderful contributions to the world, such as new kinds of literature.
The United States gained Puerto Rico, the Philippines, and Guam from Spain as a result of the Spanish-American War. It did not gain Hawaii from its victory in that war.
Answer:
Siege of Savannah; Stamp Act
Because they will only be in the state for a short period of time.