Answer:
About 2.4% of Earth's surface.
Step-by-step explanation:
You would have to be infinitely far away from a sphere in order to see exactly 50% of its surface at the same moment. Using simple geometry, you can show that an observer at a distance d above the surface of a sphere with radius R can see only a fraction A of the sphere's surface, given by the equation:
A = 50%/(1+R/d)
Where A is the fraction of the surface seen,
R is the Earth's radius, 4000 miles,
and d is the height above the Earth's surface, 200 miles.
50% = 0.5 as a fraction.
Substituting values we have
A = 0.5/(1 + 4000/200)
A = 0.5/(1 + 20) = 0.5/21
A ≈ 0.0238 = 2.38%
So from 200 miles up, an observer sees only about 2.4% of Earth's surface at once.
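As a quick numerical check of the formula above (a minimal sketch; R and d are the values from the problem):

```python
# Fraction of a sphere's surface visible from height d above it:
# A = 0.5 / (1 + R/d); A approaches 50% only as d -> infinity.
def visible_fraction(R, d):
    return 0.5 / (1 + R / d)

R = 4000.0  # Earth's radius in miles (rounded)
d = 200.0   # observer's height above the surface in miles

A = visible_fraction(R, d)
print(f"{A:.4f}")  # ≈ 0.0238, i.e. about 2.38% of the surface
```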
Answer:
x = 10
Step-by-step explanation:
x = (85 - 15)/7
x = 70/7
x = 10
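The arithmetic corresponds to solving an equation of the form 7x + 15 = 85 (the original equation isn't shown, so that form is an assumption):

```python
# Assuming the original equation was 7x + 15 = 85:
# subtract 15 from both sides, then divide by 7.
x = (85 - 15) / 7
print(x)  # 10.0
```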
<span>d. It does; the points shown on the line would be part of y = −2x.</span>
<span>The complete question includes these choices: A) 1 foot, B) 2 feet, C) 5 feet, D) 9 feet. The correct answer is B) 2 feet. Most standard desks are 28'' to 30'' tall, which is about 2.3 to 2.5 feet, so A) 1 foot is too low and C) and D) are far too high. Among the given choices, B) 2 feet is the closest estimate.</span>
Possibly 27, but I'm not certain.