Answer: by winning a global war against France
The British won the Seven Years' War (known in North America as the French and Indian War) against France and, later, Spain. The Treaty of Paris (1763) that ended it gave Britain both Canada and Florida, making it the dominant power in North America.
Source (transcript of a video included in my online school lessons):
<em>"Britain formally declared war on France in 1756, but the new commander in America, Lord Loudoun, fared no better than his predecessors. British fortunes improved when William Pitt became the new leader in London. With money he had borrowed, Pitt paid Prussia to wage war on France in Europe, and financed colonial militias in North America. The colonial troops won several important victories in 1758 and 1759. The most crucial came in September of 1760, when the British captured Montreal, forcing France out of Canada. After that, Spain and France joined forces to fight Britain in other parts of the world. A peace agreement was reached among the three countries in 1763. France lost all of Canada to Britain as well as New France from the British colonies west to the Mississippi River, but it retained its hold on the valuable West Indies, or “sugar islands.” Spain lost Florida to Britain while gaining Louisiana from France. Just thirteen years later, the British colonies would declare their independence."</em>