Answer: Subtract
Answer:
1/2
Step-by-step explanation:
1/3 + 1/6 = 2/6 + 1/6 = 3/6 = 1/2
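As a quick check, here is a minimal sketch using Python's fractions module, which does exact rational arithmetic:

from fractions import Fraction

# 1/3 + 1/6 with exact rational arithmetic
total = Fraction(1, 3) + Fraction(1, 6)
print(total)  # 1/2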
Answer:
t = 5 seconds
Step-by-step explanation:
It is given that the height h, in feet, of the ball above the ground is a quadratic function of the time t, in seconds.
We need to find how long it takes the ball to hit the ground. When it hits the ground, its height is equal to 0, so set h = 0 and solve the quadratic for t.
Solving gives two roots, one positive and one negative. Neglecting the negative time (time cannot be negative), t = 5.
It means the ball will take 5 seconds to hit the ground.
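The quadratic itself did not come through in the question as posted, so here is a minimal sketch in Python assuming a hypothetical model of the form h(t) = -16t^2 + 400 (the -16 is the standard free-fall coefficient in feet; the 400 ft starting height is an assumption chosen only so that the positive root is t = 5):

import math

# Assumed coefficients of h(t) = a*t^2 + b*t + c (hypothetical model)
a, b, c = -16.0, 0.0, 400.0

# Quadratic formula: t = (-b +/- sqrt(b^2 - 4ac)) / (2a)
disc = b**2 - 4*a*c
roots = [(-b + s*math.sqrt(disc)) / (2*a) for s in (1.0, -1.0)]

t = max(roots)  # neglect the negative time
print(roots, t)  # [-5.0, 5.0] 5.0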
To answer this, use the area of 900 square yards to find the distance between the bases. Since the infield is a square, each side is the square root of 900, which is 30 yards, so it is 30 yards from 1st to home and from 1st to 2nd.
The distance from home to 2nd is a diagonal in the square (the hypotenuse).
You will use the Pythagorean Theorem to find this distance.
a^2 + b^2 = c^2
30^2 + 30 ^2 = c^2
900 + 900 = c^2
1800 = c^2
The square root of 1800 is approximately 42.4 yards.
The ball travels approximately 42.4 yards.
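As a quick numeric check, a minimal sketch in Python (math.hypot returns sqrt(x**2 + y**2), i.e. the length of the hypotenuse):

import math

# Home to 2nd is the hypotenuse of a right triangle with 30-yard legs
c = math.hypot(30, 30)
print(round(c, 1))  # 42.4 yards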