Answer:
Check the explanation
Explanation:
1) f(n) = O(1), since the loop runs a constant number of times, independent of any input size.
There is no critical section in the code: a critical section is a part of code that is shared by multiple threads (or even processes) to modify some shared variable, and this code does not contain any variable that can be shared.
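The original loop is not shown in the question, so here is a hypothetical sketch of a loop with this behaviour: the bound is a fixed constant, so the iteration count never depends on an input size.

```python
def constant_work():
    # Hypothetical reconstruction: the loop bound (100) is a constant,
    # so the number of iterations is independent of any input -> O(1).
    count = 0
    for _ in range(100):
        count += 1
    return count
```

Whatever constant the actual code uses, the complexity argument is the same: a fixed bound means f(n) = O(1).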
2) f(n) = O(log n!): the outer loop runs n times, and the inner loop runs log k times when i = k, i.e. the total number of prints is log 1 + log 2 + log 3 + log 4 + … + log n = log(1 · 2 · 3 · 4 · … · n) = log n!
As in part 1, there is no critical section: the code contains no variable shared between threads or processes.
Note: log(m · n) = log m + log n (a property of logarithms).
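Again, the question's code is not shown; a typical loop with this cost doubles the inner counter, so the inner loop runs about log₂ i times for each i. A hypothetical sketch that counts the prints:

```python
import math

def log_factorial_prints(n):
    # Hypothetical reconstruction: for each i, the inner counter j
    # doubles until it reaches i, giving roughly log2(i) iterations.
    count = 0
    for i in range(1, n + 1):
        j = 1
        while j < i:   # about ceil(log2(i)) passes
            j *= 2
            count += 1
    return count
```

For example, log_factorial_prints(4) counts 0 + 1 + 2 + 2 = 5 iterations, close to log₂(4!) ≈ 4.58, matching the Θ(log n!) bound.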
3) f(n) = O(n²), since both the outer and inner loops run n times. The total number of print iterations is n + n + … + n (n times), which gives n · n = n².
As in the previous parts, there is no critical section: the code contains no variable shared between threads or processes.
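As before, the question's code is not reproduced here; a hypothetical sketch of a doubly nested loop with this cost:

```python
def quadratic_prints(n):
    # Hypothetical reconstruction: the inner body executes n times
    # for each of the n outer iterations -> n * n = O(n^2).
    count = 0
    for _ in range(n):
        for _ in range(n):
            count += 1   # stands in for the print statement
    return count
```

For n = 5 this counts 25 iterations, i.e. n².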
Answer:
The answer is: many top commercial ANNs forgo hidden layers completely.
Explanation:
Hidden layers are layers in an artificial neural network that sit between the input layer and the output layer. Each hidden unit computes the weighted sum of its inputs (the net input) and passes it through an activation function to produce its output; these intermediate values are not visible as network output. Adding hidden layers increases the required computation but improves prediction capability: the more hidden layers a network has, the longer it takes to produce its output, but the more complex the problems it can solve.
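The role of a hidden layer described above can be sketched as a minimal forward pass (a hypothetical illustration, not any particular library's API): the hidden units compute weighted net inputs, apply an activation, and only the output layer's values are returned.

```python
def forward(x, w_hidden, b_hidden, w_out, b_out):
    # Hidden layer: each unit takes the weighted sum of the inputs
    # plus a bias (the net input), then a ReLU activation.
    hidden = [max(0.0, sum(wi * xi for wi, xi in zip(row, x)) + b)
              for row, b in zip(w_hidden, b_hidden)]
    # Output layer consumes the hidden activations, not the raw inputs;
    # the hidden values themselves never appear as network output.
    return [sum(wi * hi for wi, hi in zip(row, hidden)) + b
            for row, b in zip(w_out, b_out)]
```

With identity-like hidden weights, e.g. forward([1.0, 2.0], [[1, 0], [0, 1]], [0, 0], [[1, 1]], [0]), the hidden activations are [1.0, 2.0] and the single output is their sum, 3.0.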