D) 96 — the circle graph shows the count directly. Hope this helps!
Answer:
t = 6.2/s (t in minutes, s in miles per minute)
Step-by-step explanation:
Average speed is defined as the rate of change of a body's distance with time. Mathematically, speed = distance/time.
Given the runner's distance in miles, d = 6.2 miles (10,000 meters is about 6.2 miles)
Time taken = t
Average speed = s
To express t in terms of the average speed s and the distance of 6.2 miles, substitute the values into the formula:
s = D/t
Substituting D = 6.2 miles into the formula:
s = 6.2/t
Multiply both sides by t:
st = 6.2
Divide both sides by 's'
st/s = 6.2/s
t = 6.2/s
Hence, the equation that gives the time t (in minutes) it takes to run 10,000 meters as a function of the runner's average speed s (in miles per minute) is t = 6.2/s.
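The relationship t = 6.2/s can be sanity-checked with a small sketch (the sample speed of 0.1 miles/minute, i.e. 6 mph, is an illustrative value, not from the problem):

```python
def time_to_finish(s):
    """Time in minutes to cover 6.2 miles at an average speed of s miles/minute."""
    return 6.2 / s

# A runner averaging 0.1 miles/minute (6 mph) finishes in about 62 minutes.
print(time_to_finish(0.1))
```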
Expand 8(5 - r)/5 = -2r as indicated:
(40 - 8r)/5 = -2r
Multiply both sides by 5, obtaining 40 - 8r = - 10r
Combine like terms: 40 = - 2r
Divide both sides by -2: -20 = r
The solution to 8(5 - r)/5 = -2r is r = -20. Verify this by substituting -20 for r in the given equation.
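A quick numeric check of the verification step (evaluating both sides of the original equation at r = -20):

```python
# Verify that r = -20 satisfies 8*(5 - r)/5 = -2r.
r = -20
lhs = 8 * (5 - r) / 5   # 8 * 25 / 5 = 40
rhs = -2 * r            # 40
print(lhs == rhs)       # both sides equal, so r = -20 checks out
```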
3(-4 + 8): distribute the 3 to -4 and 8
-12 + 24: add the terms
= 12
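The distributive step above can be confirmed by computing both forms and seeing they agree:

```python
# Distributing: 3*(-4 + 8) equals 3*(-4) + 3*8.
grouped = 3 * (-4 + 8)          # 3 * 4 = 12
distributed = 3 * (-4) + 3 * 8  # -12 + 24 = 12
print(grouped, distributed)
```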