<span>The French and Indian War changed the relationship between the British and the colonists. As a result of this war, the British got most of France's land east of the Mississippi River. ... The colonists wanted to go west to settle in the land Britain got from France. However, the Native Americans were threatening violence.</span>
Answer:
Britain ended its war with France; the British Navy no longer needed to seize American shipments or impress American sailors.
Europeans did not have the right to colonize other people. Although at the time they viewed themselves as superior to the rest of the world, colonizing other nations forced many of those nations to adopt European systems and perform labor for their colonizers rather than develop their own cultures and institutions. Europeans could have achieved their goals of wealth and power without colonizing other people. They could have done this by forming alliances with the nations they wished to colonize rather than taking them over completely. This would have led not only to peaceful and broader relations but to circumstances beneficial to both parties.
Hope this helps!