Many major-league baseball pitchers can throw the ball at 90 miles per hour. At that speed, how long does it take a pitch to travel from the pitcher’s mound to home plate, a distance of 60 feet 6 inches? Give your answer to the nearest hundredth of a second. There are 5280 feet in a mile and 12 inches in a foot.
2 answers:
To solve this problem, we must first convert the distance into miles:
distance = 60 ft × (1 mile / 5280 ft) + 6 in × (1 ft / 12 in) × (1 mile / 5280 ft)
distance ≈ 0.0114583 mile
To calculate the time:
time = distance / speed
time = 0.0114583 mile / (90 miles / 3600 seconds)
time ≈ 0.458 second
time ≈ 0.46 seconds (to the nearest hundredth)
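As a quick sanity check, here is a minimal Python sketch of the same calculation; the constant and variable names are only illustrative and are not part of the original answer.

```python
# Time for a 90 mph pitch to cover 60 ft 6 in, to the nearest hundredth of a second.
FEET_PER_MILE = 5280
INCHES_PER_FOOT = 12
SECONDS_PER_HOUR = 3600

distance_miles = (60 + 6 / INCHES_PER_FOOT) / FEET_PER_MILE   # 60.5 ft expressed in miles
speed_miles_per_sec = 90 / SECONDS_PER_HOUR                   # 90 mph expressed in miles per second

time_seconds = distance_miles / speed_miles_per_sec
print(round(time_seconds, 2))  # 0.46
```

Running it gives roughly 0.458 s, which rounds to 0.46 s, matching the result above and the step-by-step explanation below.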
Answer: It would take about 0.46 seconds for the pitch to travel from the pitcher's mound to home plate.
Step-by-step explanation:
Since we are given that
Speed at which baseball pitchers can throw the ball = 90 miles per hour
Distance covered = 60 feet 6 inches
As we know that
1 mile = 5280 feet
and 1 foot = 12 inches
6 inches = 6 × (1/12) foot = 0.5 foot
So, the total distance would be 60 feet + 0.5 feet = 60.5 feet
Now,
1 foot = 1/5280 mile
So, 60.5 feet = 60.5 / 5280 ≈ 0.011458 mile
So, the time taken by the pitch to travel from the pitcher's mound to home plate is given by
time = distance / speed = 0.011458 mile ÷ 90 miles per hour ≈ 0.0001273 hour
0.0001273 hour × 3600 seconds per hour ≈ 0.458 second
Hence, it would take about 0.46 seconds for the pitch to travel from the pitcher's mound to home plate.