Draw a diagonal (a line segment from one corner to the opposite corner). This forms two right triangles, and the diagonal is the hypotenuse of both.
Use the Pythagorean Theorem: (leg)^2 + (leg)^2 = (hypotenuse)^2.
Let h represent the length of the hypotenuse (the diagonal). Then h^2 = (leg)^2 + (leg)^2, so h = sqrt((leg)^2 + (leg)^2).
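Since the original post doesn't give the side lengths, here's a quick worked example with assumed sides of 5 and 12, just to show the arithmetic:

h^2 = 5^2 + 12^2 = 25 + 144 = 169, so h = sqrt(169) = 13.

Plug in your own side lengths the same way; if the answer isn't a perfect square, just leave it as a square root or round as your teacher prefers.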
The question is a bit vague; it should say "opposite corners." I'm assuming it does not mean the length of one side!