I must warn you I am not certain about this answer, but this is the best I can come up with. Assume the astronaut is 50 miles above the surface of the Earth. Draw a triangle ABC, where A is the Earth's centre, B is the astronaut, and C is the point on the horizon. The astronaut's line of sight BC just grazes the Earth at C, so it is tangent to the surface there, and the angle at C is a right angle. The radius of the Earth is 4000 miles, so AC = 4000 miles, and AB = 4000 + 50 = 4050 miles, which is the hypotenuse. You can then use Pythagoras' theorem to find the distance from the astronaut to the horizon:
√(4050² − 4000²) = √402,500 ≈ 634.43
So the distance from the astronaut to the horizon is about 634 miles.
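If you want to check the arithmetic, here is a minimal sketch of the same tangent-line calculation in Python (using the 4000-mile radius and 50-mile altitude from the answer above):

```python
import math

# Horizon distance for an observer h miles above a sphere of radius R.
# The sight line is tangent to the sphere, so by Pythagoras:
#   d = sqrt((R + h)^2 - R^2)
R = 4000.0  # Earth's radius in miles (value assumed in the answer)
h = 50.0    # astronaut's altitude in miles

d = math.sqrt((R + h) ** 2 - R ** 2)
print(round(d, 2))  # ≈ 634.43 miles
```

Note that adding the squares instead of subtracting them would give the wrong figure of about 5728 miles, because that treats AB and AC as the two legs instead of making AB the hypotenuse.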
36 in. = 1 yard
Simply multiply 36 × 50 to convert 50 yards into inches.
1800 inches!
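The conversion above can be sketched as a one-line helper (the function name is my own, not from the question):

```python
INCHES_PER_YARD = 36  # 36 inches = 1 yard

def yards_to_inches(yards):
    """Convert a length in yards to inches."""
    return yards * INCHES_PER_YARD

print(yards_to_inches(50))  # 1800
```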
Answer:
Download Photomath.
Step-by-step explanation:
Just go to your app store and download the app called Photomath; it can scan a problem and show you the solution step by step.
It can be, but it depends on the last digit: if the last digit is even, then the whole number is even.
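That last-digit rule can be checked with a quick sketch (a hypothetical helper, not part of the question):

```python
def is_even(n):
    """A base-10 integer is even exactly when its last digit is even."""
    last_digit = abs(n) % 10
    return last_digit % 2 == 0

print(is_even(1234))  # True  (last digit 4 is even)
print(is_even(567))   # False (last digit 7 is odd)
```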
The answer is D because the median is 4, the mean is 4, and the range is 4