A computer takes 3x^2 + 2 milliseconds to process a certain program. If the program has 4 lines of static code (this will always be required for the code to run) and x variable lines, what is the average amount of time it takes to process each line?
2 answers:
Answer:
Average time taken to process each line is (3x^2 + 2)/(x + 4) milliseconds
Step-by-step explanation:
A computer takes 3x^2 + 2 milliseconds to process a certain program.
If the program has 4 lines of static code and x variable lines, then total lines to process will be
⇒ Total lines = (x + 4)
Now, average amount of time to process each line = (Total time to process the program) ÷ (Total lines to process)
Average time = (3x^2 + 2) ÷ (x + 4) = (3x^2 + 2)/(x + 4) milliseconds
So the average time taken to process each line is (3x^2 + 2)/(x + 4) milliseconds.
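As a quick sanity check, the expression can be evaluated for a sample number of variable lines. This is a minimal sketch; the choice of x = 6 is just an illustrative value, not part of the original problem.

```python
def average_time_per_line(x):
    """Average processing time per line, in milliseconds.

    Total time for the program is 3x^2 + 2 ms, and the program has
    x variable lines plus 4 static lines, so (x + 4) lines in total.
    """
    total_time = 3 * x**2 + 2   # milliseconds
    total_lines = x + 4         # static + variable lines
    return total_time / total_lines

# Example: with x = 6 variable lines the program takes 3*36 + 2 = 110 ms
# spread over 10 lines, i.e. 11 ms per line on average.
print(average_time_per_line(6))   # 11.0
```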
Solution:
Average time per line = total time / total lines
Average time per line = (3x^2 + 2)/(x + 4)
Average time per line = 3x - 12 + 50/(x + 4) milliseconds (by polynomial long division: (x + 4)(3x - 12) = 3x^2 - 48, leaving a remainder of 50)
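The long-division form can also be checked symbolically. Below is a minimal sketch using sympy (assuming it is installed); it is just a verification aid, not part of the required solution.

```python
import sympy as sp

x = sp.symbols('x')
average = (3 * x**2 + 2) / (x + 4)

# Polynomial long division of 3x^2 + 2 by x + 4 gives
# quotient 3x - 12 and remainder 50.
quotient, remainder = sp.div(3 * x**2 + 2, x + 4, x)
print(quotient, remainder)        # 3*x - 12, 50

# Confirm that (3x^2 + 2)/(x + 4) == 3x - 12 + 50/(x + 4)
print(sp.simplify(average - (quotient + remainder / (x + 4))))  # 0
```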