Answer:
Average time taken to process each line is (3x^2 + 2)/(x + 4) milliseconds
Step-by-step explanation:
A computer takes (3x^2 + 2) milliseconds to process a certain program.
If the program has 4 lines of static code and x variable lines, then total lines to process will be
⇒ Total lines = (x + 4)
Now average amount of time to process each line = (Total time to process a program) ÷ (Total lines to process)
Average time = (3x^2 + 2)/(x + 4) milliseconds
So the average time taken to process each line is (3x^2 + 2)/(x + 4) milliseconds.
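As a quick numeric check, here is a minimal Python sketch of this computation, assuming a total processing time of 3x^2 + 2 milliseconds and x + 4 total lines; the function name and sample value are illustrative, not part of the problem:

```python
def average_time_per_line(x: int) -> float:
    total_time = 3 * x**2 + 2   # total processing time in milliseconds
    total_lines = x + 4         # x variable lines plus 4 static lines
    return total_time / total_lines

# Example: a program with 6 variable lines
print(average_time_per_line(6))  # (3*36 + 2) / 10 = 11.0 milliseconds
```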
Solution:
Average time per line = total time / total lines
Average time per line = (3x^2 + 2)/(x + 4)
Dividing 3x^2 + 2 by x + 4 with polynomial long division gives 3x^2 + 2 = (x + 4)(3x - 12) + 50, so
Average time per line = 3x - 12 + 50/(x + 4) milliseconds
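The long division above can be double-checked with a short sympy sketch (using sympy here is my choice, not part of the original solution):

```python
from sympy import symbols, div

x = symbols('x')

# div returns the quotient and remainder of polynomial division
quotient, remainder = div(3*x**2 + 2, x + 4, x)
print(quotient)   # 3*x - 12
print(remainder)  # 50
```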