Which important document announced that the American colonies no longer wished to be part of the British Empire?
2 answers:
The Declaration of Independence announced that the American colonies no longer wished to be part of the British Empire.