At a speed of 40 mph, it would take 15 minutes to drive to the mall.
The first step is to determine the distance driven to the mall. This information is not given, so it has to be calculated.
Distance = speed x total time
- speed = 30 mph.
- time = 20 minutes
Speed is given in miles per hour, while time is given in minutes, so both have to use the same time unit.
To convert the time to hours, divide by 60: 20 / 60 = 1/3 hour
Distance = 1/3 x 30 = 10 miles
The second step is to use the distance found above to determine the time it would take to travel it at a speed of 40 mph.
Time = Distance / Average Speed
Time = 10 / 40 = 1/4 hour
To convert hours to minutes, multiply by 60: 1/4 x 60 = 15 minutes
In order to determine the time that it would take to drive to the mall, first determine the distance to the mall, then use that distance and the new speed to determine the time.
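The two steps above can be sketched in a few lines of Python (the variable names are just illustrative):

```python
# Step 1: distance = speed x time, with the time converted from minutes to hours.
original_speed_mph = 30
original_time_minutes = 20
distance_miles = original_speed_mph * (original_time_minutes / 60)  # 1/3 hour x 30 mph = 10 miles

# Step 2: time = distance / speed, converted back to minutes.
new_speed_mph = 40
new_time_minutes = (distance_miles / new_speed_mph) * 60  # 1/4 hour x 60 = 15 minutes

print(distance_miles, new_time_minutes)  # 10.0 15.0
```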
Yes, that's the coordinate plane. Good job!
A CD would be the most reasonable answer.
Answer:
171
Step-by-step explanation:
First you plug -4 into the g(x) equation:
g(-4) = 1 - 3(-4)
Use PEMDAS:
g(-4) = 1 + 12
g(-4) = 13
Next you plug 13 into the f(x) equation:
f(13) = 13^2 + 2
First, 13^2 = 169
Add 2 and you get 171.
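The composition f(g(-4)) worked out above can be checked with a short Python sketch (the function definitions follow the equations in the problem):

```python
def g(x):
    # g(x) = 1 - 3x
    return 1 - 3 * x

def f(x):
    # f(x) = x^2 + 2
    return x ** 2 + 2

inner = g(-4)      # 1 - 3(-4) = 13
result = f(inner)  # 13^2 + 2 = 171
print(result)      # 171
```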