A juggler throws a ball from an initial height of 4 feet with an initial vertical velocity of 30 feet per second. The height of the ball can be modeled by h = -16t^2 + vt + s, where t is the time in seconds the ball has been in the air, v is the initial vertical velocity in feet per second, and s is the initial height in feet. Write an equation that gives the height of the ball in feet as a function of the time since it left the juggler's hand. Then, if the juggler misses the ball, how many seconds does it take to hit the ground?
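A sketch of the algebra, substituting the given values v = 30 ft/s and s = 4 ft into the model and solving h(t) = 0 for the positive root:

```latex
\begin{align*}
  h(t) &= -16t^2 + 30t + 4 && \text{(substitute } v = 30,\ s = 4\text{)}\\
  0 &= -16t^2 + 30t + 4    && \text{(the ball hits the ground when } h(t) = 0\text{)}\\
  0 &= 8t^2 - 15t - 2      && \text{(divide both sides by } -2\text{)}\\
  0 &= (8t + 1)(t - 2)     && \text{(factor)}\\
  t &= 2                   && \text{(discard the negative root } t = -\tfrac{1}{8}\text{)}
\end{align*}
```

So the height function is h(t) = -16t^2 + 30t + 4, and if the juggler misses the ball it hits the ground 2 seconds after leaving the juggler's hand.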