35 miles is equal to 35 × 5,280 feet, which is 184,800 feet.
So at 35 mph, the car travels 184,800 feet in one hour.
The next step is to divide 184,800 by 60 (minutes per hour), which gives 3,080.
So the car travels 3,080 feet per minute. Hope this helps.
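The unit conversion above can be sketched in a few lines of Python (the variable names are mine, not from the problem):

```python
# Convert a speed in miles per hour to feet per minute.
FEET_PER_MILE = 5280

speed_mph = 35
feet_per_hour = speed_mph * FEET_PER_MILE   # 35 * 5,280 = 184,800 ft per hour
feet_per_minute = feet_per_hour / 60        # 184,800 / 60 = 3,080 ft per minute

print(feet_per_minute)
```

The same two steps (multiply by 5,280, divide by 60) work for any speed in mph.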
The answer is No solution
Answer:
131.3 miles
Step-by-step explanation:
The two cars are moving toward each other from opposite directions. The total distance between the two cars = 118 miles + 256 miles = 374 miles.
Let us assume that the two cars meet at point O, let the distance between car c and O be d₁, the distance between car d and point O be d₂, hence:
d₁ + d₂ = 374 miles (1)
Let the speed of car d be x mph; therefore the speed of car c = 2x mph (twice that of car d). If it takes the cars t hours to meet at the same point, then:
For car c:
2x = d₁/t
t = d₁/(2x)
For car d:
x = d₂/t
t = d₂/ x
Since it takes both cars the same time to meet at the same point, therefore:
d₁/(2x) = d₂/x
d₁ = 2d₂
d₁ - 2d₂ = 0 (2)
Solving equations (1) and (2) simultaneously: substituting d₁ = 2d₂ into (1) gives 3d₂ = 374, so d₂ ≈ 124.7 miles and d₁ = 2d₂ ≈ 249.3 miles
Therefore the distance from the meeting point to Boston = 249.3 − 118 = 131.3 miles
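The simultaneous solution above can be checked with a short Python sketch (variable names are mine; the numbers come from the problem as stated):

```python
# Two cars start 118 mi and 256 mi from the reference point and drive
# toward each other; car c is twice as fast as car d.
total = 118 + 256        # 374 miles between the cars

# From d1 = 2*d2 and d1 + d2 = total, we get 3*d2 = total.
d2 = total / 3           # distance covered by the slower car d
d1 = 2 * d2              # distance covered by the faster car c

distance_to_boston = d1 - 118
print(round(d1, 1), round(d2, 1), round(distance_to_boston, 1))
```

Rounding to one decimal place reproduces the 249.3, 124.7, and 131.3 mile figures from the explanation.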
The remainder for this question is 42.
David will finish first. Dividing Brad's distance by 2 gives his distance for a quarter of an hour: 5.95 miles. David went 6.2 miles in that same quarter hour, and since 6.2 > 5.95, David will finish first.
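The comparison can be written out in Python. I'm assuming, per the "divide by 2" step in the answer, that Brad's stated distance (11.9 miles) covers twice the time interval of David's 6.2-mile figure:

```python
# Assumed from the answer: Brad's figure is 11.9 mi over half an hour,
# David's is 6.2 mi over a quarter of an hour.
brad_quarter_hour = 11.9 / 2    # Brad's distance in a quarter hour: 5.95 mi
david_quarter_hour = 6.2        # David's distance in the same quarter hour

winner = "David" if david_quarter_hour > brad_quarter_hour else "Brad"
print(winner)
```

Since 6.2 miles is more than 5.95 miles over the same interval, David is faster and finishes first.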