Q) Suppose algorithm A takes 10 seconds to handle a data set of 1,000 records, and suppose algorithm A is of complexity O(n²). Answer the following: (i) Approximately how long will it take to handle a data set of 1,500 records? Why? (ii) How long will it take to handle a data set of 5,000 records? Can you come up with a reason why this will not be entirely accurate but will just be an approximation?
An algorithm with O(n²) complexity is a quadratic-time algorithm: its running time grows in proportion to the square of the input size n. If the input grows by a factor of k, the running time grows by roughly a factor of k². Equivalently, for two input sizes n1 and n2:
t2 ≈ t1 x (n2/n1)²
From the question, it is given that for 1,000 records, the time required is 10 seconds, and the algorithm's running time is O(n²).
Hence,
(i) For 1,500 records
=> t = 10 x (1500/1000)²
=> t = 10 x (1.5)² = 10 x 2.25
t = 22.5 seconds
Thus, the approximate time taken for 1,500 records = 22.5 seconds, because a quadratic algorithm's running time scales with the square of the input-size ratio.
(ii) For 5,000 records
=> t = 10 x (5000/1000)²
=> t = 10 x (5)² = 10 x 25
t = 250 seconds
Thus, the approximate time taken for 5,000 records = 250 seconds.
This figure is only an approximation because Big-O notation describes asymptotic growth and hides constant factors and lower-order terms: the true running time may have the form c·n² + b·n + a, and practical effects such as caching, memory allocation, and I/O overhead also shift the actual measurement away from the pure n² estimate.
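As a sanity check, here is a minimal Python sketch of the quadratic scaling rule used above; it is not part of the original answer, and the helper name estimate_quadratic_time is a hypothetical choice for illustration:

    def estimate_quadratic_time(t_known: float, n_known: int, n_new: int) -> float:
        """Estimate the running time of an O(n^2) algorithm at a new input size.

        Assumes t(n) ~ c * n^2, so t_new / t_known = (n_new / n_known)^2.
        """
        return t_known * (n_new / n_known) ** 2

    # Known data point from the question: 1,000 records take 10 seconds.
    print(estimate_quadratic_time(10.0, 1000, 1500))  # 22.5 seconds
    print(estimate_quadratic_time(10.0, 1000, 5000))  # 250.0 seconds

The same one-line ratio computation reproduces both answers, which is exactly the hand calculation done in parts (i) and (ii).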
Device drivers (not the motherboard) contain instructions for the OS for hardware devices, such as the keyboard, mouse, and video card, and are stored in the %SystemRoot%\System32\Drivers folder (typically C:\Windows\System32\Drivers).