A computer takes 3x^2 + 2 milliseconds to process a certain program. If the program has 4 lines of static code (this will always be required for the code to run) and x variable lines, what is the average amount of time it takes to process each line?
2 answers:
Answer:
Average time taken to process each line is (3x^2 + 2)/(x + 4) milliseconds.
Step-by-step explanation:
A computer takes 3x^2 + 2 milliseconds to process a certain program.
If the program has 4 lines of static code and x variable lines, then the total number of lines to process will be
⇒ Total lines = (x + 4)
Now, the average amount of time to process each line = (Total time to process the program) ÷ (Total lines to process)
⇒ Average time = (3x^2 + 2)/(x + 4) milliseconds
So the average time taken to process each line is (3x^2 + 2)/(x + 4) milliseconds.
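As a quick sanity check, here is a minimal Python sketch of the same computation; the function name and the sample value x = 6 are illustrative, not part of the original problem.

```python
def average_time_per_line(x):
    """Average milliseconds per line: total time / total lines."""
    total_time = 3 * x**2 + 2   # total processing time in ms
    total_lines = x + 4         # 4 static lines + x variable lines
    return total_time / total_lines

# Example: x = 6 gives (3*36 + 2) / (6 + 4) = 110 / 10 = 11 ms per line.
print(average_time_per_line(6))   # -> 11.0
```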
Solution:
Average time per line = total time / total lines
Average time per line = (3x^2 + 2)/(x + 4)
Carrying out the polynomial long division (3x^2 + 2) ÷ (x + 4) gives quotient 3x - 12 and remainder 50, so:
Average time per line = 3x - 12 + 50/(x + 4) milliseconds
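The quotient and remainder are easy to double-check with sympy (assuming it is installed; this is a verification sketch, not part of either answer):

```python
from sympy import symbols, div

x = symbols('x')
# div returns (quotient, remainder) with 3*x**2 + 2 == q*(x + 4) + r
q, r = div(3*x**2 + 2, x + 4, x)
print(q, r)   # -> 3*x - 12, 50
```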