To solve for the time an object takes to hit the ground, use the equation <em>y = y₀ + v₀t + ½gt²</em>. This reduces to <em>0 = y₀ + ½gt²</em>, since there is no initial velocity and the final y position is 0 (the object is on the ground). Therefore, <em>t = (−2y₀/g)^½</em>.
With y₀ = 60 m and g = −9.8 m/s²: t = (−2 × 60 / −9.8)^½ ≈ 3.50 seconds.
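As a quick check, here is a minimal Python sketch of the same calculation (the 60 m drop height and g = −9.8 m/s² are the values used above):

```python
import math

y0 = 60.0   # initial height in metres (from the problem)
g = -9.8    # gravitational acceleration in m/s^2 (downward taken as negative)

# From 0 = y0 + (1/2) g t^2, solve for t: t = sqrt(-2*y0 / g)
t = math.sqrt(-2 * y0 / g)
print(f"Time to hit the ground: {t:.2f} s")  # prints ~3.50 s
```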
Answer:
I hope this helps; see the attachment for further information.
Sorry, but my phone couldn't capture r = 2.
Using Pythagoras' theorem, a² + b² = c², you can find the third (longest) side: (6 × 6) + (8 × 8) = 100, and the square root of 100 is 10, so the final sum is 6 + 8 + 10 = 24 mm.
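If you want to verify this, here is a short Python sketch, assuming the two legs are 6 mm and 8 mm as in the answer:

```python
import math

a, b = 6, 8                      # the two shorter sides in mm
c = math.sqrt(a**2 + b**2)       # Pythagoras: c = sqrt(a^2 + b^2) = 10
perimeter = a + b + c
print(f"Hypotenuse: {c:.0f} mm, perimeter: {perimeter:.0f} mm")  # 10 mm, 24 mm
```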
Answer:
Distance = 500 miles
Step-by-step explanation:
<u>Step 1: Write down the speed, time, and distance for the automobile and the aeroplane.</u>
<em>Automobile:</em>
Speed = r
Time = 10 hours (3 am till 1 pm)
Distance = Speed × Time
Distance = r × 10 = 10r
<em>Aeroplane:</em>
Speed = 200 + r (200 mph more than the automobile)
Time = 2 hours ( 11 am till 1 pm)
Distance = Speed × Time
Distance = 2 × (200 + r) = 400 + 2r
<u>Step 2: Equate both distances to find the value of r</u>
Distance of automobile = Distance of aeroplane
10r = 400 + 2r
8r = 400
r = 50
<u>Step 3: Find the distance using any one equation</u>
Distance of automobile = 10r
r = 50
Distance = 10 × 50 = 500 miles
Therefore, the plane travels 500 miles.
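As a sanity check, here is a minimal Python sketch that works through the same equation (the variable names are just for illustration):

```python
# Automobile: speed r, travels 10 hours (3 am to 1 pm)
# Aeroplane:  speed r + 200, travels 2 hours (11 am to 1 pm)
# Both cover the same distance: 10r = 2(r + 200)  =>  8r = 400
r = 400 / 8
distance = 10 * r                # automobile distance = aeroplane distance
print(f"r = {r:.0f} mph, distance = {distance:.0f} miles")  # r = 50, 500 miles
```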
Answer:
y = - 0.5