A micrometer is capable of measuring the thickness of an object to units of one hundredth of a millimeter. Suppose an analog micrometer is used to measure the diameter of a ball bearing that is roughly 1.5 cm across. Select a reasonable value and error for the measurement of the bearing's diameter. How did you choose the amount of error?
The options for the question aren't available, but we are told that the micrometer screw gauge in question can measure to a hundredth of a millimeter (0.01 mm).
We want to measure a ball whose diameter is roughly 1.5 cm.
1.5 cm = 15.00 mm
Since our micrometer screw gauge can resolve 0.01 mm, a common convention is to take the uncertainty as half the smallest division, i.e. ±0.005 mm. So if the scale reads 15.00 mm, the true diameter lies between 14.995 mm and 15.005 mm.
A reasonable result is therefore 15.00 ± 0.005 mm (or, more conservatively, 15.00 ± 0.01 mm, one full least count).
The amount of error was chosen from the instrument's least count: any true diameter inside that interval would still produce the same reading of 15.00 mm (1.5 cm) on the analog micrometer screw gauge.
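The arithmetic above is small enough to check by hand, but a short sketch makes the convention explicit. The variable names and the half-least-count rule are assumptions for illustration, not part of the original problem statement:

```python
# Sketch of the uncertainty arithmetic, using the values from the problem.
least_count_mm = 0.01   # smallest division of the micrometer, in mm
reading_mm = 15.00      # scale reading for the ~1.5 cm bearing

# Common convention: quote the uncertainty as half the least count.
uncertainty_mm = least_count_mm / 2
lower_mm = reading_mm - uncertainty_mm
upper_mm = reading_mm + uncertainty_mm

print(f"diameter = {reading_mm:.2f} mm, error = +/- {uncertainty_mm:.3f} mm")
print(f"true value between {lower_mm:.3f} mm and {upper_mm:.3f} mm")
```

Running this prints a diameter of 15.00 mm with bounds of 14.995 mm and 15.005 mm, matching the range quoted above.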