I believe the answer is A: the standard deviation is preferable to the range as a measure of variation because the standard deviation takes into account all of the observations, whereas the range considers only the largest and the smallest. The range gives the overall spread of the data from the lowest value to the highest, so it can be heavily influenced by anomalies. The standard deviation, on the other hand, reflects how the data are spread about the mean and lends itself to further statistical use, so inferences can be made.
Both the standard deviation and the range are measures of dispersion; they describe how spread out the data are.
While the range is simply maximum minus minimum, the standard deviation is the square root of the variance, and the variance is the average of the squared deviations of the entries from the mean.
While the range depends only on the maximum and minimum, the standard deviation depends on every entry of the sample. Hence the standard deviation is the better measure.
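To make that contrast concrete, here is a minimal sketch in Python using the standard `statistics` module and made-up numbers. It compares two samples that share the same maximum and minimum: the range cannot tell them apart, while the standard deviation reflects how differently the remaining values are spread.

```python
import statistics

def spread_measures(data):
    """Return (range, sample standard deviation) for a list of numbers."""
    data_range = max(data) - min(data)
    std_dev = statistics.stdev(data)  # square root of the sample variance
    return data_range, std_dev

# Two made-up samples with identical max and min but different middles.
tight  = [10, 12, 12, 12, 12, 14]   # most values sit near the mean
spread = [10, 10, 10, 14, 14, 14]   # values pushed toward the extremes

print(spread_measures(tight))   # (4, ~1.26): same range...
print(spread_measures(spread))  # (4, ~2.19): ...but a larger standard deviation
```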
A. The standard deviation is preferable to the range as a measure of variation because the standard deviation takes into account all of the observations, whereas the range considers only the largest and smallest ones.