Using the concept of the mean, we find that, on average, it takes him 11.9 seconds to run a 100-yard dash.
The mean of a data set is given by the <u>sum of all observations in the data set divided by the number of observations</u>.
In this problem, the three observations are given by:
12.7 s, 11.2 s, 11.8 s.
Hence the mean is:
M = (12.7 + 11.2 + 11.8)/3 = 35.7/3 = 11.9 s.
On average, it takes him 11.9 seconds to run a 100-yard dash.
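As a quick check, the calculation above can be reproduced in a few lines of Python (the `mean` helper here is illustrative, not part of the original answer):

```python
def mean(observations):
    """Sum of all observations divided by the number of observations."""
    return sum(observations) / len(observations)

# The three 100-yard dash times, in seconds.
times = [12.7, 11.2, 11.8]

print(round(mean(times), 1))  # 11.9
```

The same result is available from the standard library via `statistics.mean(times)`.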
More can be learned about the mean of a data set at brainly.com/question/24628525
6(y + 3) is the answer. Factor 6y + 18 by taking out the GCF of 6y and 18, which is 6.
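The factoring step can be sketched in Python using the greatest common divisor of the two coefficients (the expression 6y + 18 is assumed from the stated answer):

```python
from math import gcd

# Coefficients of the expression 6y + 18.
a, b = 6, 18

# The GCF of 6 and 18 is the factor pulled out front.
g = gcd(a, b)
print(g)                       # 6
print(f"{g}(y + {b // g})")    # 6(y + 3)
```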
Answer:
2/5