Answer:
t ≈ 7.9 seconds
Step-by-step explanation:
A ball is thrown upward with an initial velocity of 35 meters per second from a cliff that is 30 meters high. The height of the ball is given by the quadratic equation h = -4.9t^2 + 35t + 30, where h is in meters and t is the time in seconds since the ball was thrown. Find the time it takes the ball to hit the ground. Round your answer to the nearest tenth of a second.
-----------------
h(t) = -4.9t^2 + 35t + 30
----
When the ball hits the ground its height is zero.
So, solve -4.9t^2 + 35t + 30 = 0
Use the quadratic formula:
t = [-35 ± sqrt(35^2 - 4(-4.9)(30))]/(2(-4.9))
----
t = [-35 ± sqrt(1225 + 588)]/(-9.8)
t = [-35 ± sqrt(1813)]/(-9.8)
---
To get a positive solution, take the minus sign in the numerator (the denominator is negative):
t = [-35 - 42.58]/(-9.8)
t = 77.58/9.8 ≈ 7.9 seconds
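----
As a quick numerical check, here is a minimal Python sketch (not part of the original solution; the variable names a, b, c, t1, t2 are mine) that evaluates the same quadratic formula and confirms the positive root:

import math

# Coefficients of h(t) = -4.9t^2 + 35t + 30
a, b, c = -4.9, 35.0, 30.0

# Quadratic formula: t = (-b ± sqrt(b^2 - 4ac)) / (2a)
disc = b**2 - 4*a*c                     # 1225 + 588 = 1813
t1 = (-b + math.sqrt(disc)) / (2*a)     # negative root, not physical
t2 = (-b - math.sqrt(disc)) / (2*a)     # positive root

print(round(t2, 1))                     # 7.9
print(-4.9*t2**2 + 35*t2 + 30)          # approximately 0: the ball is at ground level

Only the positive root is physically meaningful, and it matches the rounded answer above.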
========================