Answer:
<em>a. The rock takes 2.02 seconds to hit the ground.</em>
<em>b. The rock lands 20.2 m from the base of the cliff.</em>
Explanation:
Horizontal projectile motion occurs when an object is thrown horizontally with an initial speed v from a height h above the ground. The object then follows a curved path, shaped by gravity, until it hits the ground.
The time taken by the object to hit the ground is calculated by:

t = √(2h/g)
The range is the horizontal distance traveled by the object before it lands, and it can be calculated as follows:

d = v·t = v·√(2h/g)
The man is standing on the edge of the h=20 m cliff and throws a rock with a horizontal speed of v=10 m/s.
a.
The time taken by the rock to reach the ground is:

t = √(2 · 20 m / 9.8 m/s²)

t = 2.02 s
The rock takes 2.02 seconds to hit the ground
b.
The range is now calculated:

d = v·t = 10 m/s · 2.02 s
d = 20.2 m
The rock lands 20.2 m from the base of the cliff.
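The two steps above can be checked with a short script. This is a minimal sketch (the function name `projectile_from_cliff` is my own, not from the problem) that applies the same formulas, t = √(2h/g) and d = v·t, with g = 9.8 m/s²:

```python
import math

def projectile_from_cliff(h, v, g=9.8):
    """Time of flight and range for a horizontal launch from height h."""
    t = math.sqrt(2 * h / g)  # vertical free fall fixes the flight time
    d = v * t                 # constant horizontal speed covers the range
    return t, d

t, d = projectile_from_cliff(h=20, v=10)
print(f"t = {t:.2f} s, d = {d:.1f} m")  # t = 2.02 s, d = 20.2 m
```

Because the horizontal and vertical motions are independent, the fall time depends only on the 20 m height, and the 10 m/s throw speed only scales the landing distance.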