Answer:
At 50 miles per hour:
50 miles = 1 hour
10 miles = 1/50 hour × 10
= 1/5 hour
= 12 minutes
It will take him 12 minutes to drive 10 miles.
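The same proportion can be checked with a short Python sketch (variable names are illustrative, not from the problem statement):

```python
# Time to cover a distance at constant speed: time = distance / speed.
speed_mph = 50
distance_miles = 10

hours = distance_miles / speed_mph   # 10/50 = 1/5 hour
minutes = hours * 60                 # convert hours to minutes
print(minutes)  # 12.0
```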
Answer:
See below
Step-by-step explanation:
The y-component of the initial velocity is 70 sin 30° ft/s.
y position = 3 + 70 sin 30° * t - 1/2 g t^2
when the ball hits the ground y = 0
0 = 3 + 70 sin 30° t - 1/2 (32.2)t^2
- 16.1 t^2 + 35t + 3 = 0
Use Quadratic Formula to find t = <u>2.26 seconds</u>
Horizontal component of initial velocity = 70 cos 30°
Horizontal distance = 70 cos 30° * t
= 70 cos 30° (2.26) = <u>137.0 ft</u>
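The steps above can be verified with a short Python sketch (the symbol names `v0`, `theta`, `y0` are my own labels for the given values):

```python
import math

v0 = 70.0                 # initial speed, ft/s
theta = math.radians(30)  # launch angle
y0 = 3.0                  # initial height, ft
g = 32.2                  # gravitational acceleration, ft/s^2

# Impact time: solve -1/2 g t^2 + v0 sin(theta) t + y0 = 0
a = -0.5 * g
b = v0 * math.sin(theta)
c = y0
# Since a < 0, this branch of the quadratic formula gives the positive root
t = (-b - math.sqrt(b * b - 4 * a * c)) / (2 * a)

# Horizontal distance covered in that time
x = v0 * math.cos(theta) * t

print(round(t, 2))  # ~2.26 s
print(round(x, 0))  # ~137 ft
```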