A real-life problem that leads to a quadratic equation:
A ball is thrown into the air from the edge of a building, 50 feet above the ground. Its initial velocity is 20 feet per second. About how long does it take for the ball to hit the ground?
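One way to work this out, assuming the usual projectile model for heights in feet, h(t) = -16t^2 + 20t + 50, is to solve h(t) = 0 with the quadratic formula. A minimal Python sketch of that calculation:

import math

# Minimal sketch, assuming the standard model h(t) = -16*t**2 + v0*t + h0 (feet, seconds)
h0 = 50.0        # initial height above the ground, in feet
v0 = 20.0        # initial upward velocity, in feet per second
a, b, c = -16.0, v0, h0

# Quadratic formula; the "minus" branch gives the positive root here
t = (-b - math.sqrt(b**2 - 4*a*c)) / (2*a)
print(t)         # 2.5 seconds

The positive root is exactly 2.5, so under that model the ball hits the ground after about 2.5 seconds.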
Answer: 30 mph

Step-by-step explanation:
distance s = 3 miles
time t = 6 minutes

Convert the time to hours:
60 minutes = 1 hour
1 minute = 1/60 hour
6 minutes = 6 * 1/60 = 0.1 hour

so
average speed = s / t = 3 / 0.1 = 30 mph
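The same unit conversion and division can be checked with a short Python sketch (the numbers are taken from the steps above):

# Sketch: convert minutes to hours, then divide distance by time
distance_miles = 3.0
time_minutes = 6.0

time_hours = time_minutes / 60.0          # 6 minutes = 0.1 hour
average_speed_mph = distance_miles / time_hours
print(average_speed_mph)                  # 30.0 mph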
Set the two altitude expressions equal and solve for the time s:

3427 + 65.75s = 5000 + 35.5s
65.75s - 35.5s = 5000 - 3427
30.25s = 1573
s = 1573 / 30.25
s = 52 seconds <== they are equal at s = 52 seconds

Check by substituting s = 52 into both expressions:
3427 + 65.75(52) = 6846
5000 + 35.5(52) = 6846

so their altitude at that moment is 6846 <===
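A small Python sketch that solves the same equation and checks both expressions (the names alt_a and alt_b are placeholders, since the original problem statement isn't shown):

# Sketch: find when the two altitude expressions are equal, then check both
def alt_a(s):
    return 3427 + 65.75 * s

def alt_b(s):
    return 5000 + 35.5 * s

# 3427 + 65.75*s = 5000 + 35.5*s  =>  s = (5000 - 3427) / (65.75 - 35.5)
s = (5000 - 3427) / (65.75 - 35.5)
print(s)                   # 52.0 seconds
print(alt_a(s), alt_b(s))  # 6846.0 6846.0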
Answer: 180 degrees
Step-by-step explanation: I think it is 180 degrees.