Answer: 12 minutes
Step-by-step explanation:
50 miles = 1 hour
10 miles = 10 × (1/50) hour
= 1/5 hour
= 12 minutes
It will take him 12 minutes to drive 10 miles.
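The proportion above can be sketched in a few lines of Python (variable names here are illustrative, not from the problem statement):

```python
# Time to cover a distance at a constant speed: time = distance / speed.
speed_mph = 50          # miles per hour, from the problem
distance_miles = 10     # miles to drive

hours = distance_miles / speed_mph   # 10 / 50 = 0.2 hours
minutes = hours * 60                 # 0.2 hours * 60 = 12 minutes
print(minutes)  # 12.0
```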
Answer: 63
Step-by-step explanation:
First, multiply 45 by 0.40. You multiply by 0.40 because that is the percent increase written as a decimal.
45 × 0.40 = 18
Then add the 18 to 45, which gets you 63.
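The same percent-increase calculation can be checked in Python (a minimal sketch using the numbers above):

```python
# Increase 45 by 40%: compute the increase, then add it to the original.
original = 45
rate = 0.40                    # 40% expressed as a decimal

increase = original * rate     # 45 * 0.40 = 18
result = original + increase   # 45 + 18 = 63
print(result)  # 63.0
```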
Answer: 18.5 hours
To solve this problem, you first need to figure out the average amount of money per hour the worker earns.
That would be the base salary plus the average tips per hour. So
$6 + $12 = $18
Then to figure out how many hours the worker needs to work, divide the goal by the hourly earnings. So
$333 / $18 = 18.5 hours.
Therefore, on average, it will take 18.5 hours to earn $333, assuming a
base salary of $6/hour and an average of $12 per hour in tips.
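The two steps above (add the hourly rates, then divide the goal by the total) look like this in Python; the variable names are illustrative:

```python
goal = 333            # target earnings in dollars
base_wage = 6         # base salary in dollars per hour
tips_per_hour = 12    # average tips in dollars per hour

hourly = base_wage + tips_per_hour   # $6 + $12 = $18 per hour
hours_needed = goal / hourly         # $333 / $18 = 18.5 hours
print(hours_needed)  # 18.5
```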
y/x = k
6.4/4 = 1.6
11.2/7 = 1.6
16/10 = 1.6
20.8/13 = 1.6
This is a direct variation and the constant of variation is 1.6.
y=1.6x
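As a quick check in Python, we can confirm that every (x, y) pair from the table gives the same ratio k (the list of pairs is taken from the ratios above):

```python
# (x, y) pairs from the table; direct variation means y/x is constant.
pairs = [(4, 6.4), (7, 11.2), (10, 16), (13, 20.8)]

ratios = [y / x for x, y in pairs]
k = ratios[0]
# All ratios should match k (within floating-point tolerance).
assert all(abs(r - k) < 1e-9 for r in ratios)
print(k)  # 1.6
```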