It takes him 180 minutes to run 10.5 miles.
<h3>How many minutes did it take him to run 10.5 miles?</h3>
The given parameters are
Speed = 3.5 miles per hour
Distance = 10.5 miles
The time is calculated as:
Time = Distance/Speed
So, we have
Time = 10.5 miles/3.5 miles per hour
Evaluate the quotient
Time = 3 hours
Convert to minutes
Time = 180 minutes
Hence, it takes him 180 minutes to run 10.5 miles.
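The calculation above can be sketched in a few lines of Python (values taken directly from the problem):

```python
# Time = Distance / Speed, using the given values.
distance_miles = 10.5
speed_mph = 3.5

time_hours = distance_miles / speed_mph   # 10.5 / 3.5 = 3.0 hours
time_minutes = time_hours * 60            # convert hours to minutes

print(time_hours)    # 3.0
print(time_minutes)  # 180.0
```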
Answer:
2x + 3y + 1
Step-by-step explanation:
-2x + 4x - 6y + 9y + 3 - 2
= (-2x + 4x) + (-6y + 9y) + (3 - 2)   (grouping like terms)
= 2x + 3y + 1
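A quick sanity check of the simplification: the original and simplified expressions should agree at any sample points (a sketch, not part of the original answer):

```python
# Check that -2x + 4x - 6y + 9y + 3 - 2 simplifies to 2x + 3y + 1
# by evaluating both expressions at several sample points.
def original(x, y):
    return -2*x + 4*x - 6*y + 9*y + 3 - 2

def simplified(x, y):
    return 2*x + 3*y + 1

for x in (0, 1, -2.5):
    for y in (0, 3, -1.5):
        assert original(x, y) == simplified(x, y)
print("expressions agree")
```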
Answer:
Go to the question you want to answer; there should be an "Add answer" button. Click it, type out your answer, and submit it.
To find the slope, use the rise-over-run rule.
The line rises 2 units for every 1 unit it runs, so the slope is
2 (that is, 2/1).
To find the y-intercept, find where the line crosses the y-axis;
it crosses at -4.
So the slope is 2 and the y-intercept is -4, giving the line y = 2x - 4.
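The rise-over-run reading can be sketched as follows (assuming the slope-intercept form y = mx + b with the values read from the graph):

```python
# Slope from rise over run, then the line's equation.
rise, run = 2, 1
slope = rise / run        # 2.0
y_intercept = -4          # where the line crosses the y-axis

def y(x):
    """The line in slope-intercept form: y = slope*x + y_intercept."""
    return slope * x + y_intercept

print(slope)   # 2.0
print(y(0))    # -4.0  (x = 0 recovers the y-intercept)
```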