4. Only (ii)
<u>Explanation:</u>
An array can be declared in either of two ways:
1. int a[100];
2. int[] a = new int[100];
Array indexing always begins at position 0. If the size of the array is a whole number n (say 10), a traversal runs from index 0 to n-1 (here 0 to 9), which covers every element of the array.
An array has a fixed, finite size, and the general rule is that its valid indices run from 0 to n-1, where n is the size of the array. In the example above, the index range of the array is 0 through 99, not 1 through 100.
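A minimal Java sketch (assuming the Java-style declaration from option 2 above; the class name is only for illustration) showing the valid index range of a 100-element array:

```java
// Hypothetical sketch: an array of size 100 has valid indices 0 .. 99.
public class ArrayIndexDemo {
    public static void main(String[] args) {
        int[] a = new int[100];                // valid indices: 0 .. 99

        for (int i = 0; i < a.length; i++) {   // i runs from 0 to n - 1
            a[i] = i;
        }

        System.out.println(a[0]);   // first element, index 0
        System.out.println(a[99]);  // last element, index n - 1
        // System.out.println(a[100]); // would throw ArrayIndexOutOfBoundsException
    }
}
```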
Answer: option d is correct
Explanation:
It improves the quality and efficiency of computer systems. They can show part-to-whole relations, extrapolations, or predictions.
Answer:
In this case, the country that produces rye will produce 24 million bushels per week, and the country that produces jeans will produce 64 million pairs per week.
Explanation:
Total labor hours = 4 million hours
Bushels produced per hour = 6
⇒ total bushels produced = 6 × 4 = 24 million
∴ we get
The country that produces rye will produce 24 million bushels per week.
Now,
Total labor hours = 4 million hours
Pairs produced per hour = 16
⇒ total pairs produced = 16 × 4 = 64 million
∴ we get
The country that produces jeans will produce 64 million pairs per week.
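A minimal Java sketch of the same arithmetic, assuming only the figures given in the problem (4 million labor hours, 6 bushels per hour, 16 pairs per hour); the class and variable names are illustrative:

```java
// Hypothetical sketch: weekly output = productivity per labor hour * labor hours.
public class OutputCalc {
    public static void main(String[] args) {
        double laborHoursMillions = 4;   // total labor hours per week, in millions
        double bushelsPerHour = 6;       // rye productivity
        double pairsPerHour = 16;        // jeans productivity

        double ryeOutput = bushelsPerHour * laborHoursMillions;   // 24 million bushels
        double jeansOutput = pairsPerHour * laborHoursMillions;   // 64 million pairs

        System.out.println("Rye: " + ryeOutput + " million bushels per week");
        System.out.println("Jeans: " + jeansOutput + " million pairs per week");
    }
}
```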
Answer: Provided in the explanation section
Explanation:
The question for this problem reads:
Question:
I am sorting data that is stored over a network connection. Based on the properties of that connection, it is extremely expensive to "swap" two elements. But looping over the elements and looking at their values is very inexpensive. I want to minimize swaps above all other factors. Choose the sorting algorithm we studied that will perform the best:
ANSWER
1. Merge Sort, because merge sort copies elements into additional memory instead of swapping them in place (see the sketch below the list).
2. Both Merge Sort and Quick Sort can be parallelized across multiple processors.
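A minimal Java sketch of a textbook top-down merge sort (not any particular implementation from the course), showing that it only reads values and writes them sequentially into an auxiliary buffer rather than swapping elements in place:

```java
// Hypothetical sketch: merge sort never swaps two elements; it reads values
// and writes them into an auxiliary buffer, which suits a medium where
// swaps are expensive but reads are cheap.
public class MergeSortSketch {

    public static void mergeSort(int[] a) {
        if (a.length < 2) return;
        int[] buffer = new int[a.length];   // extra memory instead of swapping
        sort(a, buffer, 0, a.length - 1);
    }

    private static void sort(int[] a, int[] buffer, int lo, int hi) {
        if (lo >= hi) return;
        int mid = (lo + hi) / 2;
        sort(a, buffer, lo, mid);        // the two halves could also be sorted
        sort(a, buffer, mid + 1, hi);    // on separate processors and merged,
        merge(a, buffer, lo, mid, hi);   // which is why merge sort parallelizes well
    }

    // Merge two sorted halves by copying into the buffer, then back.
    private static void merge(int[] a, int[] buffer, int lo, int mid, int hi) {
        int i = lo, j = mid + 1, k = lo;
        while (i <= mid && j <= hi) {
            buffer[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];
        }
        while (i <= mid) buffer[k++] = a[i++];
        while (j <= hi)  buffer[k++] = a[j++];
        System.arraycopy(buffer, lo, a, lo, hi - lo + 1);
    }

    public static void main(String[] args) {
        int[] data = {5, 2, 9, 1, 7};
        mergeSort(data);
        System.out.println(java.util.Arrays.toString(data)); // [1, 2, 5, 7, 9]
    }
}
```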
Cheers, I hope this helps!
Answer:
In computer terminology, a generation is a change in the technology a computer uses. Initially, the term "generation" was used to distinguish between varying hardware technologies. Nowadays, a generation includes both the hardware and the software that together make up an entire computer system.