Given a broadcast radius of 50 miles and the line joining the cities at (0, 56) and (58, 0), the transmitter's signal can be picked up for 59.24 miles of the drive.
<h3>How can the duration of signal reception be found?</h3>
Radius of broadcast of the transmitter = 50 miles
Location of starting point = 56 miles north of the transmitter
Location of destination city = 58 miles east of the transmitter
Therefore we have;
Slope of the line joining the two cities
= (0 - 56) ÷ (58 - 0) = -0.966
Which gives the equation of the line as follows;
y = -0.966•x + 56
The equation of the circle is;
x^2 + y^2 = 50^2 = 2500
Substituting y = -0.966•x + 56 into the circle's equation gives;
1.933156•x^2 - 108.192•x + 636 = 0
Solving the quadratic equation gives;
x ≈ 6.67 or x ≈ 49.29
When x = 6.67, we have;
- y = -0.966 × 6.67 + 56 = 49.56
When x = 49.29, we have;
- y = -0.966 × 49.29 + 56 = 8.4
The length of the drive, during which the driver can pick the signal, <em>l</em>, is therefore;
l = √((49.56-8.4)^2 + (49.29-6.67)^2) = <u>59.24 miles</u>
- The length of the drive during which the signal is received is 59.24 miles
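The steps above can be sketched numerically. This is a minimal check, assuming the general substitution of y = mx + c into x² + y² = r², which yields the quadratic (1 + m²)x² + 2mcx + (c² − r²) = 0:

```python
import math

# Circle of radius 50 centred at the transmitter; line through
# (0, 56) and (58, 0).
m = -56 / 58      # slope of the line joining the cities
c = 56            # y-intercept
r = 50            # broadcast radius in miles

# Substituting y = m*x + c into x^2 + y^2 = r^2 gives
# (1 + m^2)*x^2 + 2*m*c*x + (c^2 - r^2) = 0.
a = 1 + m**2
b = 2 * m * c
k = c**2 - r**2

disc = math.sqrt(b**2 - 4 * a * k)
x1 = (-b - disc) / (2 * a)    # first intersection, x ≈ 6.67
x2 = (-b + disc) / (2 * a)    # second intersection, x ≈ 49.29
y1 = m * x1 + c
y2 = m * x2 + c

# Chord length = distance between the two intersection points.
length = math.hypot(x2 - x1, y2 - y1)
```

Running this reproduces the intersection points and a chord length of about 59.24 miles.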
Learn more about the equation of a circle here:
brainly.com/question/502872
Answer:
y = (3/4)x - 8
Step-by-step explanation:
You can use the equation y = m x + c to answer this question.
In this equation,
y = m x + c
m = slope
c = y - intercept
So, they have already given in the question that,
m = 3/4
c = (-8)
Therefore, you can simply put 3/4 and (-8) instead of m and c respectively like this.
y = m x + c
y = (3/4)x - 8
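As a quick sketch, the line can be evaluated at a few sample x values (the sample points here are illustrative, not part of the question):

```python
def y(x):
    m = 3 / 4   # slope given in the question
    c = -8      # y-intercept given in the question
    return m * x + c

# x = 0 recovers the y-intercept.
intercept = y(0)
sample = y(4)
```

For example, y(0) gives the y-intercept −8, and y(4) gives 3 − 8 = −5.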
Hope this helps you.
Let me know if you have any other questions :-)
Answer:
(3,3)
Step-by-step explanation:
Using the midpoint formula:
midpoint = ((x₁ + x₂)/2, (y₁ + y₂)/2)
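A minimal sketch of the midpoint formula; the endpoints (2, 1) and (4, 5) are assumed for illustration, since the original endpoints are not shown, and their midpoint is (3, 3), matching the answer:

```python
def midpoint(p, q):
    # Average the x-coordinates and the y-coordinates.
    (x1, y1), (x2, y2) = p, q
    return ((x1 + x2) / 2, (y1 + y2) / 2)

# Hypothetical endpoints chosen so the midpoint is (3, 3).
m = midpoint((2, 1), (4, 5))
```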
Hope this helps!
Answer:
x = 5.6
Step-by-step explanation:
5x - 7 = 21
5x = 28
x = 5.6
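The two steps above can be checked in one line, undoing each operation in reverse order:

```python
# Solve 5x - 7 = 21: add 7 to both sides, then divide by 5.
x = (21 + 7) / 5
```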
Can I get brainliest?