Many major-league baseball pitchers can throw the ball at 90 miles per hour. At that speed, how long does it take a pitch to travel from the pitcher's mound to home plate, a distance of 60 feet 6 inches? Give your answer to the nearest hundredth of a second. There are 5280 feet in a mile and 12 inches in a foot.
2 answers:
To solve this problem, we must first convert the distance into miles:
distance = 60 ft × (1 mile / 5280 ft) + 6 in × (1 ft / 12 in) × (1 mile / 5280 ft)
distance = 0.0114583 mile
To calculate the time:
time = distance / speed
time = 0.0114583 mile / (90 miles / 3600 seconds)
time = 0.458 seconds
time ≈ 0.46 seconds
Answer: It would take about 0.46 seconds for the pitch to travel from the pitcher's mound to home plate.
Step-by-step explanation:
We are given that:
Speed at which baseball pitchers can throw the ball = 90 miles per hour
Distance covered = 60 feet 6 inches
We know that
1 mile = 5280 feet
and 1 foot = 12 inches
6 inches = 6 × (1 foot / 12 inches) = 0.5 feet
So, the total distance is 60 feet + 0.5 feet = 60.5 feet.
Now,
1 foot = 1/5280 mile
So, 60.5 feet = 60.5 / 5280 mile ≈ 0.0114583 mile
So, the time taken by the pitch to travel from the pitcher's mound to home plate is given by
time = distance / speed = 0.0114583 mile ÷ (90 miles / 3600 seconds) ≈ 0.458 seconds
Hence, it would take about 0.46 seconds for the pitch to travel from the pitcher's mound to home plate.
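For anyone who wants to double-check the arithmetic, here is a short Python sketch of the same unit conversion (the constant and variable names are just for illustration, not part of the original answers):

# Verify the pitch travel time: 60 ft 6 in at 90 mph
FEET_PER_MILE = 5280
INCHES_PER_FOOT = 12
SECONDS_PER_HOUR = 3600

# Distance: 60 feet 6 inches, converted to miles
distance_feet = 60 + 6 / INCHES_PER_FOOT        # 60.5 feet
distance_miles = distance_feet / FEET_PER_MILE  # ~0.0114583 miles

# Speed: 90 miles per hour, converted to miles per second
speed_miles_per_second = 90 / SECONDS_PER_HOUR  # 0.025 miles per second

time_seconds = distance_miles / speed_miles_per_second
print(round(time_seconds, 2))                   # prints 0.46

Running this prints 0.46, matching both answers above.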