The standard deviation quantifies the amount of variation or dispersion in a set of data values. It is the square root of the variance.
The standard deviation of the sample mean is the population standard deviation divided by the square root of the sample size, so in this case it is 50 / sqrt(100) = 50 / 10 = 5.
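A minimal Python sketch of the calculation above (assuming, as the numbers suggest, a population standard deviation of 50 and a sample of size 100):

import math

sigma = 50   # population standard deviation (from the problem)
n = 100      # sample size (from the problem)

# The standard deviation is the square root of the variance.
variance = sigma ** 2
assert math.sqrt(variance) == sigma

# Standard deviation of the sample mean: sigma / sqrt(n)
sd_of_mean = sigma / math.sqrt(n)
print(sd_of_mean)  # 5.0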
The sum of six times a number and 5 translates to the expression 6b + 5.
Explanation: b is your variable, which is multiplied by 6 (written 6b, since multiplication needs no symbol between them). Afterwards, you add 5 to get the final answer, 6b + 5.
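A quick way to sanity-check the translation (a Python sketch; the variable name b comes from the explanation above, and the sample value is just illustrative):

b = 7                   # any number; b is the variable from the explanation
expression = 6 * b + 5  # "six times a number, plus 5"
print(expression)       # 47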
There are 20 nickels and 10 pennies.
A nickel is worth 5 cents, so 20 nickels x 5 cents = 100 cents = $1.00. A penny is worth 1 cent, so 10 pennies x 1 cent = 10 cents = $0.10. The total is $1.00 + $0.10 = $1.10.
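The same arithmetic as a short Python sketch (working in cents to keep the values exact):

nickel = 5  # cents per nickel
penny = 1   # cents per penny
total_cents = 20 * nickel + 10 * penny  # 100 + 10 = 110
print(f"${total_cents / 100:.2f}")      # $1.10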