The time it takes for climbers to reach the highest point of a mountain is normally distributed with a standard deviation of 0.75 hours. If a sample of 35 people is drawn randomly from the population, what would be the standard error of the mean of the sample?

0.02
0.13
0.17
0.40
0.76
The best approach is to use the formula:

Standard error = standard deviation / sqrt(sample size)

Here, the standard deviation is 0.75 hours and the sample size is 35. Hence:

Standard error = 0.75 / sqrt(35) ≈ 0.127 ≈ 0.13
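As a quick check of the arithmetic, here is a minimal Python sketch (using only the standard library's math module; the variable names are just illustrative) that computes the standard error for these values:

import math

# Given values from the problem
std_dev = 0.75      # population standard deviation, in hours
sample_size = 35    # number of climbers sampled

# Standard error of the mean = standard deviation / sqrt(sample size)
standard_error = std_dev / math.sqrt(sample_size)

print(round(standard_error, 2))  # prints 0.13

Rounded to two decimal places, this confirms 0.13 as the correct option.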