It takes him 180 minutes to run 10.5 miles.
<h3>How many minutes did it take him to run 10.5 miles?</h3>
The given parameters are
Speed = 3.5 miles per hour
Distance = 10.5 miles
The time is calculated as:
Time = Distance/Speed
So, we have
Time = 10.5 miles/3.5 miles per hour
Evaluate the quotient
Time = 3 hours
Convert to minutes
Time = 180 minutes
Hence, it takes him 180 minutes to run 10.5 miles.
Read more about speed at:
brainly.com/question/6504879
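The steps above can be checked with a short Python sketch (time = distance / speed, then hours converted to minutes):

```python
# Check the arithmetic above: time = distance / speed.
distance_miles = 10.5
speed_mph = 3.5

time_hours = distance_miles / speed_mph   # 10.5 / 3.5 = 3.0 hours
time_minutes = time_hours * 60            # 3 hours = 180 minutes

print(time_minutes)  # 180.0
```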
To find the slope, take your two points (x1, y1) and (x2, y2) and compute (y2 − y1)/(x2 − x1). After you calculate that, simplify and substitute into y = mx + b (the slope goes in the m position; x is usually left as a variable).
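The recipe above can be sketched in Python; the points (1, 2) and (3, 6) are made-up examples, not from the original question:

```python
def slope(p1, p2):
    """Slope of the line through p1 = (x1, y1) and p2 = (x2, y2)."""
    (x1, y1), (x2, y2) = p1, p2
    return (y2 - y1) / (x2 - x1)

# Example: the line through (1, 2) and (3, 6)
m = slope((1, 2), (3, 6))   # (6 - 2) / (3 - 1) = 2.0
b = 2 - m * 1               # solve y = mx + b for b using the point (1, 2)
print(f"y = {m}x + {b}")
```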
Answer:
r = 20
Step-by-step explanation:
-4 = r/20 - 5
-4 + 5 = r/20 - 5 + 5
1 = r/20
1 * 20 = r/20 * 20
r = 20
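The same two steps (add 5 to both sides, then multiply both sides by 20) can be mirrored in Python as a quick check:

```python
# Solve -4 = r/20 - 5 by undoing each operation.
lhs = -4
after_add = lhs + 5    # add 5 to both sides: 1 = r/20
r = after_add * 20     # multiply both sides by 20: r = 20

# Verify in the original equation: r/20 - 5 should equal -4.
assert r / 20 - 5 == -4
print(r)  # 20
```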
Answer:
-8
Step-by-step explanation:
Subtracting 5 from −2 gives 7x = −7.
Dividing both sides by 7, x = −1.
Then 8x = 8 × (−1) = −8.
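The work above implies the starting equation was 7x + 5 = −2 (that equation is an assumption read off from the steps shown); a quick Python check:

```python
# Assumed equation (inferred from the steps above): 7x + 5 = -2.
x = (-2 - 5) / 7       # 7x = -7, so x = -1.0
assert 7 * x + 5 == -2  # check in the original equation
print(8 * x)  # -8.0
```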
Answer:
The speed of the plane will be 144 mph.
Step-by-step explanation:
Let the speed of the plane be x mph.
Then the car's speed is (x − 130) mph.
According to the question,
⇒ 720/x = 70/(x − 130)
By applying cross-multiplication, we get
⇒ 720(x − 130) = 70x
⇒ 720x − 93600 = 70x
⇒ 720x − 70x = 93600
⇒ 650x = 93600
⇒ x = 93600/650
⇒ x = 144
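A quick Python check of a setup consistent with this answer. The distances (720 miles for the plane, 70 miles for the car, covered in the same time) are assumptions from the classic form of this problem, chosen to match the stated result:

```python
# Assumed setup: plane flies 720 mi while the car drives 70 mi in the
# same time, and the car is 130 mph slower than the plane.
# Equal times give 720/x = 70/(x - 130).
# Cross-multiplying: 720(x - 130) = 70x  ->  650x = 93600.
x = 93600 / 650
print(x)  # 144.0

# Sanity check: the travel times really are equal.
assert 720 / x == 70 / (x - 130)
```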