Answer:
Provided that the sample size n is sufficiently large (conventionally, greater than 30), the distribution of sample means drawn from a population will be approximately normal, according to the Central Limit Theorem.
Explanation:
1. As n increases, the sample mean approaches the population mean
(the Law of Large Numbers).
2. The standard error of the sample mean is
σ/√n
where σ = the population standard deviation.
As n increases, the standard error decreases, which means the typical error
between the sample mean and the population mean shrinks (see the sketch below).
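Here is a minimal simulation sketch (not part of the original answer, and assuming NumPy is available) that illustrates both points: sample means drawn from a deliberately skewed population look increasingly normal as n grows, and their observed spread tracks the theoretical standard error σ/√n.

```python
# Sketch: sample means from a skewed (exponential) population.
# For an exponential distribution with a given scale, sigma = scale.
import numpy as np

rng = np.random.default_rng(0)
scale = 2.0  # so sigma = 2.0

for n in (5, 30, 100):
    # Draw 10,000 samples of size n and take each sample's mean.
    means = rng.exponential(scale=scale, size=(10_000, n)).mean(axis=1)
    print(f"n={n:3d}  observed SE = {means.std():.4f}  "
          f"sigma/sqrt(n) = {scale / np.sqrt(n):.4f}")
```

The observed standard error of the 10,000 sample means matches σ/√n closely at every n, and a histogram of `means` would look progressively more bell-shaped as n increases past 30.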
The difference would be found by subtracting the variables and then dividing the difference by 2.
This is not possible. Why not? Because the smallest value the variance can take is 0.
Recall that s represents the standard deviation, so s^2 is the variance. Variance measures how spread out the values are: the higher the variance, the more spread out the data. You can think of it as the average squared distance from the mean. If the variance is 0, then all of the values are at the same point, so a list like {2, 2, 2, 2, 2} has variance 0, and no list can have a smaller variance than that. If your teacher insists all the values in the list are different, then the variance will be greater than 0.
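As a quick check (a sketch using Python's standard statistics module, where variance computes the sample variance s^2):

```python
# The variance of a constant list is 0; any list containing at least
# two distinct values has a strictly positive variance.
from statistics import variance

print(variance([2, 2, 2, 2, 2]))  # 0 -- every value sits at the mean
print(variance([1, 2, 2, 2, 3]))  # 0.5 -- distinct values, variance > 0
```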