Answer:
1. Western Europe
2. Anglo-Saxons
3. pillage and gift
4. Pepin the Short
5. three
6. Vikings
7. bourgeoisie
8. Investiture
9. pope
10. scholasticism
Explanation:
11. Answers may vary. A sample answer is provided. England was a leader in literacy from its early history. Not only was literacy high in England, but its monks also led literacy efforts across Europe. Literacy was key to developing the strong legal and taxation systems that helped the English state grow more powerful.
12. Answers may vary. A sample answer is provided. Important intellectual ideas were developed from the twelfth to fifteenth centuries. During the Twelfth-Century Renaissance, scholars translated many Arabic texts into Latin, which made Islamic and ancient Greek scholarship and ideas accessible to the educated classes in Europe. In the thirteenth century, scholastics showed that theological questions could be discussed using logical reasoning. Finally, the European Renaissance introduced humanism, which began to challenge the central place of religious values in society. All three movements built on ideas from the ancient Greeks and Romans.
As we know, the Roman Empire was very successful, and if we look closely, it is easy to identify the means by which it grew. Rome expanded primarily through war and conquest: it would conquer other peoples, take their resources, and force them to pay tribute (payments to the power that had conquered them). Rome had an incredibly effective army; its military techniques were sophisticated, and its war machines greatly contributed to its power on the battlefield. Rome rarely relied on alliances and agreements, and Christianity became influential only late in the empire's history, so neither explains its growth.
This being said, it is clear that the Roman Empire relied on conquest and war to expand and thrive. Alas, all empires fall, as did the mighty Roman Empire.
Thus, your answer.
Judaism is a West Asian belief system.
A compromise is reached when both sides get some of what they want, or when the outcome is even for both.
The correct answer to this question is that Japan bombed Pearl Harbor. Japan's attack on Pearl Harbor is what finally brought the United States into the war. Although the war had begun with Germany's invasion of Poland, the United States had chosen to stay out of it. Everything changed on December 7, 1941, a date that Americans will never forget.