Answer:
Step-by-step explanation:
For every term after the first, we have that the term of the sum is greater than 1.
So, neglecting the first term, the Nth partial sum is greater than N - 1, which grows without bound as N increases.
This means that the series diverges.
Hope this helps!
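In symbols, the bounding argument looks like this (a sketch only; the symbol a_n for the general term is an assumption, since the original formulas did not survive):

```latex
% Sketch of the divergence argument; a_n denotes the general term
% (an assumed name, since the original formulas were lost), with
% a_n > 1 for every n >= 2.
\[
S_N = \sum_{n=1}^{N} a_n \;\ge\; a_1 + \sum_{n=2}^{N} 1 \;=\; a_1 + (N - 1)
\;\longrightarrow\; \infty \quad \text{as } N \to \infty,
\]
% so the partial sums are unbounded and the series diverges.
```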
Using the concept of the mean, it is found that, on average, it takes him 11.9 seconds to run a 100-yard dash.
What is the mean?
The mean of a data set is given by the sum of all observations in the data set divided by the number of observations.
In this problem, the three observations are given by:
12.7 s, 11.2 s, 11.8 s.
Hence the mean is:
M = (12.7 + 11.2 + 11.8)/3 = 11.9.
On average, it takes him 11.9 seconds to run a 100-yard dash.
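The same computation as a quick check in Python (a minimal sketch; the variable names are illustrative):

```python
# Mean of the three recorded 100-yard dash times (seconds).
times = [12.7, 11.2, 11.8]

# Sum of observations divided by number of observations.
mean = sum(times) / len(times)
print(f"Mean time: {mean:.1f} s")  # -> Mean time: 11.9 s
```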
More can be learned about the mean of a data set at brainly.com/question/24628525
I'm pretty sure that it's 0.
Answer:
Sir, please give the exact question.
Step-by-step explanation:
That is what we have to do with these three lines.