Answer:
Step-by-step explanation:
The question is incomplete. Here is the complete question
On a flight New York to London an airplane travels at a constant speed. An equation relating the distance traveled in miles d to the number of hours flying t is t= 1/500d. How long will it take the airplane to travel 800 miles?
Speed is the change in distance of a body with respect to time. The question expresses the relationship mathematically as t = 1/500 × d.
To determine the time it will take to travel 800 miles, we substitute d = 800 into the modeled equation.
t = 1/500 × 800
t = 800/500
t = 8/5
t = 1.6 hours
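The substitution above can be checked with a minimal Python sketch (the function name `time_to_travel` is just for illustration):

```python
def time_to_travel(d):
    """Time in hours from the model t = (1/500) * d, where d is miles."""
    return d / 500

# Substitute d = 800 miles into the equation.
t = time_to_travel(800)
print(t)  # 1.6 hours
```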
Answer:
-19 = x
Step-by-step explanation:
Step 1: Write equation
-4(x + 1) - 3 = -3(x - 4)
Step 2: Solve for <em>x</em>
<u>Distribute:</u> -4x - 4 - 3 = -3x + 12
<u>Combine like terms:</u> -4x - 7 = -3x + 12
<u>Add 4x on both sides:</u> -7 = x + 12
<u>Subtract 12 on both sides:</u> -19 = x
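A quick sanity check in Python: plugging x = -19 back into both sides of the original equation should give equal values.

```python
x = -19

# Left side:  -4(x + 1) - 3
lhs = -4 * (x + 1) - 3

# Right side: -3(x - 4)
rhs = -3 * (x - 4)

print(lhs, rhs)  # both sides evaluate to 69
```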
Answer:
4.8 miles per hour.
Step-by-step explanation:
Average speed = Total distance / Time taken
= 26.2 miles / 5.5 hours
≈ 4.8 miles per hour (4.7636… rounded to one decimal place)
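The division above can be reproduced in a short Python sketch:

```python
distance = 26.2  # miles
time = 5.5       # hours

# Average speed = total distance / time taken
average_speed = distance / time

print(round(average_speed, 1))  # 4.8 miles per hour
```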