I must warn you I am not certain about this answer, but it is the best I can come up with. Let's assume the astronaut is 50 miles above the surface of the Earth, due east of the centre of the Earth, and the satellite is also 50 miles above the surface, due north of the centre. You can draw a triangle ABC where A is the centre of the Earth, AB is the distance from the centre to the astronaut, AC is the distance from the centre to the satellite, and BC is the distance from the astronaut to the satellite. The radius of the Earth is 4000 miles, so the distance from the centre to the astronaut is 4000 + 50 = 4050 miles, and the same is true for the satellite, so AB and AC are both 4050 miles. Because due east and due north are perpendicular, the angle at A is a right angle, so you can use Pythagoras' theorem to find the distance from the astronaut to the satellite:
√(4050² + 4050²) ≈ 5727.56
So the distance from the astronaut to the satellite is about 5728 miles.
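If you want to double-check the arithmetic, here is a quick Python sketch (the names R and h are just my own labels for the radius and altitude, not part of the original problem):

```python
import math

R = 4000.0  # radius of the Earth in miles (given)
h = 50.0    # altitude of both the astronaut and the satellite

# Both objects sit R + h miles from the centre, 90 degrees apart
# (due east vs. due north), so the triangle has its right angle at A.
AB = AC = R + h
BC = math.hypot(AB, AC)  # Pythagoras: sqrt(AB^2 + AC^2)

print(round(BC, 2))  # 5727.56
```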
Answer:
-6.07
Step-by-step explanation:
Answer:
0.0599
Step-by-step explanation:
There are four aces in a standard 52-card deck. If X = 5, the first ace must appear exactly on the fifth draw, which means the probability of X = 5 is the probability of not getting an ace in the first four draws and then getting an ace on the fifth:

P(X = 5) = (48/52) × (47/51) × (46/50) × (45/49) × (4/48) ≈ 0.0599

The probability that X = 5 is 0.0599.
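Here is a small Python check of that product, using exact fractions (this snippet is just my own verification, not part of the original solution):

```python
from fractions import Fraction

# No ace in the first four draws, then an ace on the fifth draw.
p = (Fraction(48, 52) * Fraction(47, 51) * Fraction(46, 50)
     * Fraction(45, 49) * Fraction(4, 48))

print(float(p))  # 0.05989..., which rounds to 0.0599
```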
(1.)
Would be the correct answer because you can't add the two values together: they are not like terms, and both would need to contain 'x' to be combined. For example, 3x + 5x = 8x, but 3x + 5 cannot be simplified any further.
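If it helps, you can see the same thing with sympy (Python is used here just as an illustration, and the expression 3x + 5 is my own example, not necessarily the one from the question):

```python
import sympy as sp

x = sp.symbols('x')

# Like terms share the same variable part, so they combine: 3x + 5x = 8x
print(3*x + 5*x)  # 8*x

# 3x and 5 are not like terms, so the sum cannot be reduced further
print(3*x + 5)    # 3*x + 5
```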