Given a broadcast radius of 50 miles and the straight-line drive between the cities at (0, 56) and (58, 0), the signal can be picked up for about 59.24 miles of the drive.
<h3>How can the duration of signal reception be found?</h3>
Radius of broadcast of the transmitter = 50 miles
Location of starting point = 56 miles north of the transmitter
Location of destination city = 58 miles east of the transmitter
Therefore, we have:
Slope of the line joining the two cities
= (0 - 56) ÷ (58 - 0) = -56 ÷ 58 ≈ -0.966
Which gives the equation of the line as follows:
y = -0.966•x + 56
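The slope and intercept can be checked with a short Python sketch (the coordinates come from the problem statement; the rest is just the slope-intercept computation):

```python
# Slope and intercept of the drive line between the two cities.
x1, y1 = 0, 56   # starting city: 56 miles north of the transmitter
x2, y2 = 58, 0   # destination city: 58 miles east of the transmitter

m = (y2 - y1) / (x2 - x1)  # = -56/58, approximately -0.966
b = y1 - m * x1            # = 56
print(f"y = {m:.3f}x + {b:.0f}")  # prints: y = -0.966x + 56
```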
The equation of the 50-mile reception circle, centered at the transmitter, is:
x^2 + y^2 = 50^2 = 2500
Substituting y = -0.966•x + 56 into the circle equation and simplifying gives the quadratic:
1.933156•x^2 - 108.192•x + 636 = 0
Solving with the quadratic formula gives x ≈ 6.67 or x ≈ 49.29 (a quick numerical check is sketched below).
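A minimal sketch that solves the quadratic numerically, with the coefficients taken from the equation above:

```python
import math

# Solve 1.933156x^2 - 108.192x + 636 = 0 for the circle crossings.
a, b, c = 1.933156, -108.192, 636
disc = math.sqrt(b**2 - 4*a*c)
x_low = (-b - disc) / (2*a)   # ≈ 6.67
x_high = (-b + disc) / (2*a)  # ≈ 49.29
print(round(x_low, 2), round(x_high, 2))  # prints: 6.67 49.29
```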
Therefore, when x = 6.67, we have:
- y = -0.966 × 6.67 + 56 ≈ 49.56
When x = 49.29, we have:
- y = -0.966 × 49.29 + 56 ≈ 8.4
The length of the drive during which the driver can pick up the signal, <em>l</em>, is the distance between the two intersection points, therefore:
l = √((49.56 - 8.4)^2 + (49.29 - 6.67)^2) ≈ <u>59.24 miles</u>
- The length of the drive during which the signal is received is 59.24 miles
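As a cross-check (a sketch, not part of the original working), the chord length can also be computed exactly from the perpendicular distance between the transmitter and the road: the exact line y = -(28/29)x + 56 rewrites as 28x + 29y - 1624 = 0.

```python
import math

# Chord length = 2 * sqrt(r^2 - d^2), where d is the perpendicular
# distance from the transmitter (the origin) to the line
# 28x + 29y - 1624 = 0.
r = 50
d = abs(-1624) / math.sqrt(28**2 + 29**2)  # distance from origin to line
length = 2 * math.sqrt(r**2 - d**2)
print(f"{length:.2f} miles")  # prints: 59.23 miles
```

The small difference from 59.24 comes from rounding the slope and the intersection coordinates in the intermediate steps.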
Learn more about the equation of a circle here:
brainly.com/question/502872