Conversion: 1 mile = 1,760 yards
Multiply.
2.5 * 1760 = 4,400
Therefore, there are 4,400 yards in 2.5 miles.
Best of Luck!
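The conversion above can be checked with a short script (the constant and function names here are just illustrative choices):

```python
# 1 mile = 1,760 yards, per the conversion factor stated above.
YARDS_PER_MILE = 1760

def miles_to_yards(miles):
    """Convert a distance in miles to yards by multiplying by 1,760."""
    return miles * YARDS_PER_MILE

print(miles_to_yards(2.5))  # 4400.0
```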
Since this line is written in point-slope form, y - y1 = m(x - x1), the coefficient m is the slope. Here m = 3, so the slope is 3.
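A minimal sketch of reading the slope off point-slope form (the function and the sample point (x1, y1) below are hypothetical, chosen only for illustration):

```python
# A line in point-slope form: y - y1 = m*(x - x1), i.e. y = y1 + m*(x - x1).
def point_slope_y(x, m=3, x1=0, y1=0):
    """Return y on the line y - y1 = m*(x - x1)."""
    return y1 + m * (x - x1)

# The slope is rise over run between any two points on the line:
slope = (point_slope_y(2) - point_slope_y(1)) / (2 - 1)
print(slope)  # 3.0
```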
Answer:
His time would most likely be no greater than 27.9 minutes and no less than 21.5 minutes.
Step-by-step explanation:
With a standard deviation of 3.2 minutes, the upper end of the likely range for Roman's 3-mile run time is
Average time + standard deviation = 24.7 minutes + 3.2 minutes = 27.9 minutes
The lower end of the likely range is
Average time - standard deviation = 24.7 minutes - 3.2 minutes = 21.5 minutes
Hence, his time would most likely be no greater than 27.9 minutes and no less than 21.5 minutes.
Answer:
Denote the two desired numbers by x and y. Then:
x + y = 17
2x = 3y + 4
<=>
x = 17 - y
2*(17 - y) = 3y + 4
<=>
x = 17 - y
34 - 2y = 3y + 4
<=>
x = 17 - y
5y = 30
<=>
y = 6
x = 17 - 6 = 11
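The substitution steps above can be sketched and verified with a few lines of code:

```python
# System from the problem:
#   x + y = 17
#   2x = 3y + 4
# Substituting x = 17 - y gives 34 - 2y = 3y + 4, i.e. 5y = 30.
y = 30 / 5    # from 5y = 30
x = 17 - y    # from x = 17 - y

print(x, y)   # 11.0 6.0

# Check both original equations:
assert x + y == 17
assert 2 * x == 3 * y + 4
```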
Hope this helps!
:)