It will take 0.4 seconds to get to the home plate.
Given:
The speed of the pitch thrown by Justin is 100 miles per hour
The distance of home plate from the pitching rubber is 60.5 feet
To find:
The time taken by a pitch to reach the home plate
Solution:
The distance of home plate from the pitching rubber = d = 60.5 feet

The speed of the pitch thrown by Justin = s = 100 miles/hr
The time taken by the pitch to reach home plate = t = ?
The speed of a moving object is the distance covered divided by the time taken to cover it, so the time is t = d / s.

First convert the distance to miles: d = 60.5 ft ÷ 5280 ft/mi ≈ 0.011458 miles.

Then t = d / s = 0.011458 miles ÷ 100 miles/hr ≈ 0.000114583 hours.

In an hour there are 3600 seconds, so 0.000114583 hours × 3600 s/hr ≈ 0.41 seconds.

It will take about 0.4 seconds for the pitch to reach home plate.
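The conversion chain (feet → miles → hours → seconds) can be sketched in Python; this is just a quick check of the arithmetic, not part of the original answer:

```python
# Pitch-time check: t = d / s, with unit conversions.
FEET_PER_MILE = 5280
SECONDS_PER_HOUR = 3600

distance_miles = 60.5 / FEET_PER_MILE    # 60.5 ft expressed in miles
speed_mph = 100                          # pitch speed in miles per hour

time_hours = distance_miles / speed_mph  # t = d / s
time_seconds = time_hours * SECONDS_PER_HOUR

print(time_seconds)  # ≈ 0.4125, i.e. about 0.4 seconds
```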
Learn more about conversions here:
brainly.com/question/24530464
brainly.com/question/17743460
I know it's a long problem, but I have to show how I got my answer.
Least to greatest would be 2.30% (= 0.023), then 1/5 (= 0.2), and then 0.23.
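Converting each value to a plain decimal makes the ordering easy to verify; a minimal Python check:

```python
# Express each value as a decimal, then sort ascending.
values = {"1/5": 1 / 5, "0.23": 0.23, "2.30%": 2.30 / 100}

ordered = sorted(values.items(), key=lambda kv: kv[1])
print([name for name, _ in ordered])  # ['2.30%', '1/5', '0.23']
```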
Answer:
7 miles per hour
Step-by-step explanation:
3 1/2 miles = 7/2 miles
1/2 hour = 30 minutes
So,
In 30 mins, travelled 7/2 miles
In 1 min, travelled 7/2 ÷ 30= 7/2× 1/30 = (7×1)/(2×30) = 7/60 miles
In 60 mins, travelled (7×60)/60 miles = 420/60 miles = 7 miles.
Hence, the speed is 7 miles per hour.
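The per-minute scaling argument above can be sketched as a quick check; the variable names here are just for illustration:

```python
# Speed = distance / time, worked as a unit rate per minute.
distance_miles = 3.5        # 3 1/2 miles
minutes_taken = 30          # 1/2 hour

per_minute = distance_miles / minutes_taken  # miles covered each minute
speed_mph = per_minute * 60                  # scale up to 60 minutes
print(speed_mph)  # ≈ 7.0 miles per hour
```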
Hope this helps :)