To find the time taken for the radio signal to reach the Earth's surface, we proceed as follows:
time = distance / speed
distance = 7.5 × 10^6 meters
speed = 3 × 10^8 meters/second
Therefore:
time = (7.5 × 10^6) / (3 × 10^8) = 0.025 seconds
We conclude that the signal takes 0.025 seconds to reach the surface of the Earth.
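As a quick sanity check, the same arithmetic can be reproduced in a few lines of Python (the variable names here are my own, not from the original problem):

```python
# Time for a radio signal to cover a given distance at the speed of light.
distance_m = 7.5e6       # distance to the Earth's surface, in meters (from above)
speed_m_per_s = 3e8      # speed of light, in meters per second

time_s = distance_m / speed_m_per_s
print(time_s)  # 0.025 (seconds)
```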
Answer:
y = x - 3
How I know I'm correct:
If you plug in the numbers from the table, the equation checks out:
For x = 3: y = 3 - 3 = 0
For x = 4: y = 4 - 3 = 1
For x = 5: y = 5 - 3 = 2
and so on.
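For completeness, here is a minimal Python sketch that checks the same points; the `points` list simply restates the (x, y) pairs verified above:

```python
# Check y = x - 3 against the (x, y) pairs from the table.
points = [(3, 0), (4, 1), (5, 2)]

for x, y in points:
    assert y == x - 3, f"({x}, {y}) does not satisfy y = x - 3"
print("All points satisfy y = x - 3")
```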
By definition, we have the following equation:
d = v * t
where:
d = distance
v = speed (rate)
t = time
If he drives at 45 mph for t hours and at 60 mph for the remaining (8 - t) hours, the total distance traveled will be:
45t + 60(8 - t) = 405
Solving for the time:
45t + 480 - 60t = 405
-15t = -75
t = 5 hours
Answer:
He drives at 45 mph for 5 hours.
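A short Python sketch of the same algebra, assuming (as the equation above does) a trip of 8 hours and 405 miles total:

```python
# Solve 45*t + 60*(8 - t) = 405 for t, where t is in hours (speeds are in mph).
total_miles = 405
total_hours = 8
slow_mph, fast_mph = 45, 60

# Rearranged by hand: 45t + 480 - 60t = 405  =>  15t = 75  =>  t = 5
t = (fast_mph * total_hours - total_miles) / (fast_mph - slow_mph)
print(t)  # 5.0 -> he drives at 45 mph for 5 hours
```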