Answer:
The smaller the difference between the voltmeter's resistance and the resistance being measured, the larger the error in the voltmeter's reading.
Explanation:
An ideal voltmeter has infinite parallel resistance, so it draws no current from the circuit under measurement and therefore reads the exact voltage across the element.
In practice, however, a real voltmeter does not have infinite resistance, so every practical voltmeter suffers from the loading effect to some extent.
The smaller the difference between the voltmeter's resistance and the resistance being measured, the larger the error in the reading. This is why we want the voltmeter's resistance to be as high as possible, ideally infinite, so that the error is minimized.
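To make this trend concrete, here is a minimal Python sketch. It assumes a hypothetical circuit in which the measured resistor R is fed from a source Vs through a series resistance Rs (all values here are illustrative, not from the question), and it sweeps the voltmeter's input resistance Rv downward:

    def reading(Vs, Rs, R, Rv):
        # Voltage a voltmeter with input resistance Rv reads across R
        # when R is fed from a source Vs through a series resistance Rs.
        Rp = R * Rv / (R + Rv)        # R in parallel with the voltmeter
        return Vs * Rp / (Rs + Rp)    # voltage-divider rule

    Vs, Rs, R = 1.0, 0.5e6, 0.5e6     # illustrative, hypothetical values
    ideal = Vs * R / (Rs + R)         # an ideal voltmeter draws no current

    for Rv in (100e6, 10e6, 1e6, 0.5e6):
        v = reading(Vs, Rs, R, Rv)
        err = 100 * (ideal - v) / ideal
        print(f"Rv = {Rv / 1e6:5.1f} MΩ -> reads {v:.4f} V, error {err:4.1f} %")

As Rv falls toward R, the printed error climbs from roughly 0.2 % to about 33 %.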
Let's consider the given scenario:
A voltmeter has a parallel resistance of 1 MΩ, and the resistance of the element being measured is 500 kΩ (0.5 MΩ).
Let's suppose the supplied voltage is 1 V.
First, let's assume the voltmeter is ideal, with infinite resistance; in that case it draws no current and measures the full 1 V across the 0.5 MΩ resistor.
Now consider the loading effect: when we connect the voltmeter across the 0.5 MΩ resistor, the two resistances combine in parallel:
R = (1 MΩ × 0.5 MΩ) / (1 MΩ + 0.5 MΩ)
R ≈ 0.33 MΩ
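(A quick Python check of this parallel combination:)

    Rv, R = 1.0e6, 0.5e6
    print(f"{Rv * R / (Rv + R) / 1e6:.2f} MΩ")   # prints 0.33 MΩ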
As you can see, the voltmeter now faces a reduced resistance. In any real circuit the 0.5 MΩ resistor is fed through some source or series resistance, so the voltage-divider rule applies: the voltage across an element in a series path is proportional to its resistance. The reduced parallel combination therefore takes a smaller share of the supplied voltage, and the reading drops below the true value.
Therefore, a voltmeter with a very high parallel resistance is preferred.
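As a final sketch, here is the scenario's arithmetic in Python. The numbers above only produce a reading error if the 0.5 MΩ resistor is fed through some series resistance; as an illustrative assumption, take a 2 V source behind a hypothetical 0.5 MΩ series resistor, which makes the ideal (unloaded) reading exactly the 1 V used above:

    # Hypothetical completion of the scenario: 0.5 MΩ measured resistor,
    # fed from a 2 V source through an assumed 0.5 MΩ series resistance,
    # measured by a voltmeter with 1 MΩ parallel resistance.
    Vs, Rs, R, Rv = 2.0, 0.5e6, 0.5e6, 1.0e6

    ideal = Vs * R / (Rs + R)                # 1.0 V with an ideal voltmeter
    Rp = R * Rv / (R + Rv)                   # 0.33 MΩ parallel combination
    loaded = Vs * Rp / (Rs + Rp)             # 0.8 V with the 1 MΩ voltmeter
    error = 100 * (ideal - loaded) / ideal   # 20 % reading error
    print(ideal, loaded, error)

Repeating this with Rv = 10 MΩ gives a reading of about 0.976 V (roughly a 2.4 % error), which is one reason typical digital multimeters specify an input resistance of 10 MΩ or more.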