Answer:
t = 6.2/s (t in minutes, s in miles per minute)
Step-by-step explanation:
Average speed is the rate at which a body covers distance. Mathematically: speed = Distance/Time
Given the distance of the runner in miles, d = 6.2 miles (10,000 meters is approximately 6.2 miles)
Time taken = t
Average speed = s
To express t in terms of the average speed s and the distance of 6.2 miles, substitute the values into the formula:
s = D/t
Substituting D = 6.2miles into the formula;
s = 6.2/t
Cross-multiply:
st = 6.2
Divide both sides by s:
st/s = 6.2/s
t = 6.2/s
Hence, the equation that can be used to determine the time t (in minutes) it takes to run 10,000 meters as a function of the runner's average speed s (in miles per minute) is t = 6.2/s.
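The relationship above can be sketched in a short Python snippet (the 6.2-mile figure is the approximate mile equivalent of 10,000 meters, as used in the answer; the function name is mine):

```python
def time_for_10k(s):
    """Time in minutes to run 10,000 m (about 6.2 miles) at average speed s,
    where s is given in miles per minute."""
    return 6.2 / s

# At an average speed of 0.124 miles/minute (roughly 7.4 mph),
# the run takes 6.2 / 0.124 = 50 minutes.
print(time_for_10k(0.124))
```

Note that faster speeds (larger s) give smaller times, as the inverse relationship t = 6.2/s requires.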
The answer is $244.50, since anything over 40 hours is paid at time and a half.
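Since the original problem's hourly rate and hours worked are not shown here, the time-and-a-half rule can only be sketched with hypothetical numbers (the 45 hours and $10/hour below are illustrative, not the figures behind $244.50):

```python
def weekly_pay(hours, rate):
    """Pay the first 40 hours at the regular rate, and any hours
    beyond 40 at 1.5 times the regular rate (time and a half)."""
    regular = min(hours, 40) * rate
    overtime = max(hours - 40, 0) * rate * 1.5
    return regular + overtime

# Hypothetical example: 45 hours at $10/hour -> 40*10 + 5*15 = 475.0
print(weekly_pay(45, 10))
```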
I am not sure I understand the question exactly, but I think you need to show that the point (2, 1) is a solution of the following system of equations. Here is how to do that:
With the point (2, 1), 2 is x and 1 is y, because coordinates are written as (x, y).
Next, substitute into the equations x + 3y = 5 and y = -x + 3 and simplify:
(2) + 3(1) = 5 and 1 = -(2) + 3
Now check whether both equations hold:
2 + 3 = 5, so (2, 1) is a solution of the first equation
-2 + 3 = 1, so (2, 1) is also a solution of the second equation
In conclusion, (2, 1) is a solution of the system of equations.
Hope this helps :)
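The substitution check above can be sketched in Python (assuming the system is x + 3y = 5 and y = -x + 3, which is what the worked arithmetic suggests; the function name is mine):

```python
def is_solution(x, y):
    """Check whether the point (x, y) satisfies both equations
    of the system: x + 3y = 5 and y = -x + 3."""
    return x + 3 * y == 5 and y == -x + 3

print(is_solution(2, 1))  # True: (2, 1) satisfies both equations
```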