A test driver has to drive a car for two rounds on a 1-mile track so that his average speed is 60 mph. On his first round
his average speed is 40 mph. How fast does he need to drive the second round to get the right average speed?
1 answer:
To average 60 mph over the 2-mile run, his total time has to be 2 miles ÷ 60 mph = 2 minutes.
His first lap at 40 mph already takes 1 mile ÷ 40 mph = 1.5 minutes, which leaves only 0.5 minutes for the second mile.
Covering 1 mile in half a minute means he needs to drive the second lap at 120 mph.
(Just making the two speeds add up to 120 and answering 120 − 40 = 80 mph doesn't work, because average speed is total distance over total time, and the slow first lap uses up most of the time budget.)
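As a quick sanity check, here is a small Python sketch of the time-based calculation (the variable names are just for illustration, not from the original problem):

```python
# Average speed is total distance divided by total time,
# so work with times rather than a simple mean of the two speeds.

track_length = 1.0        # miles per lap
target_avg = 60.0         # required average speed over both laps, mph
first_lap_speed = 40.0    # average speed on the first lap, mph

total_time = 2 * track_length / target_avg        # hours allowed for the 2 miles
first_lap_time = track_length / first_lap_speed   # hours already used on lap 1
remaining_time = total_time - first_lap_time      # hours left for lap 2

second_lap_speed = track_length / remaining_time  # mph needed on lap 2
print(second_lap_speed)   # prints 120.0
```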