In the eyes of the British, the American colonies existed primarily to benefit Britain economically.
I hope this straightforward explanation answers your question :)
Answer:
The British wanted to increase sales of their own products.
Explanation:
Because Britain viewed the colonies mainly as a market for British goods, boosting sales of its own products to the colonists served Britain's economic interests.