Measures of variability show the extent to which data in a statistical distribution differ from an average value, as well as how much they differ from each other. In other words, they show how far the data fall from the center value (the mean), indicating the width of the distribution. The most commonly used measures of variability are the range, interquartile range, variance, and standard deviation; which one is appropriate depends on the data being measured.
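As a minimal sketch (assuming Python's standard statistics module and a made-up sample list called data), the four measures named above can be computed like this:

```python
import statistics

data = [4, 7, 7, 9, 12, 15, 18]  # hypothetical sample values

data_range = max(data) - min(data)            # range: distance between extremes
q1, q2, q3 = statistics.quantiles(data, n=4)  # quartile cut points of the sample
iqr = q3 - q1                                 # interquartile range: spread of the middle 50%
variance = statistics.variance(data)          # sample variance
std_dev = statistics.stdev(data)              # sample standard deviation

print(data_range, iqr, variance, std_dev)
```

The range reacts strongly to outliers, while the interquartile range and standard deviation describe the spread around the center more robustly.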
To solve this, write 0.505 over 1 (0.505/1) and then multiply by 1000/1000 to remove the decimal, giving you 505/1000. To return to decimal form, divide 505 by 1000.
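A quick check of that conversion (assuming Python's standard fractions module) looks like this; note that Fraction also reduces 505/1000 to lowest terms:

```python
from fractions import Fraction

frac = Fraction("0.505")  # parse the decimal as an exact fraction
print(frac)               # 101/200, which is 505/1000 in lowest terms
print(505 / 1000)         # back to decimal form: 0.505
```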