Answer:
The running times for each implementation choice are explained below.
Explanation:
<u>Using an Unsorted Array </u>
With an unsorted array, finding the largest value requires scanning every element, so findMax is slow: its time complexity is O(N). On the other hand, if we keep the array unsorted, insert is O(1) -- ignoring the time to expand the array -- but deleteMax is O(N), since we must search the whole array for the largest value.
The empty operation is trivial -- just return true if numItems == 0, so it (and the constructor) are both O(1).
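For concreteness, here is a minimal sketch of the unsorted-array version; the class and method names (UnsortedArrayPQ, insert, findMax, deleteMax, empty) are illustrative rather than prescribed:
public class UnsortedArrayPQ {
    private Comparable[] queue = new Comparable[10];
    private int numItems = 0;

    // O(1): just drop the new value at the end (ignoring array expansion).
    public void insert(Comparable item) {
        if (numItems == queue.length) {
            queue = java.util.Arrays.copyOf(queue, queue.length * 2);
        }
        queue[numItems++] = item;
    }

    // O(N): must scan the whole array to find the largest value.
    public Comparable findMax() {
        int maxIndex = 0;
        for (int i = 1; i < numItems; i++) {
            if (queue[i].compareTo(queue[maxIndex]) > 0) {
                maxIndex = i;
            }
        }
        return queue[maxIndex];
    }

    // O(N): find the largest value, then fill its slot with the last item.
    public Comparable deleteMax() {
        int maxIndex = 0;
        for (int i = 1; i < numItems; i++) {
            if (queue[i].compareTo(queue[maxIndex]) > 0) {
                maxIndex = i;
            }
        }
        Comparable max = queue[maxIndex];
        queue[maxIndex] = queue[--numItems];
        return max;
    }

    // O(1): the empty test just checks the counter.
    public boolean empty() {
        return numItems == 0;
    }
}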
<u>Using a Sorted Array </u>
The findMax operation is faster than with an unsorted array: whether the array is sorted in ascending or descending order, the largest value sits at a known end, so the time complexity is O(1). In general, if we use an array to store the values in the priority queue, we can either store the values in sorted order (which makes the insert operation slow and the deleteMax operation fast) or in arbitrary order (which makes the insert operation fast and the deleteMax operation slow). Note also that we'll need a field to keep track of the number of values currently in the queue, we'll need to decide how big to make the array initially, and we'll need to expand the array if it gets full. Here's a partial class definition:
public class PriorityQueue {
    private Comparable[] queue;              // values in the priority queue
    private int numItems;                    // number of values currently stored
    private static final int INIT_SIZE = 10; // initial array capacity

    public PriorityQueue() {
        queue = new Comparable[INIT_SIZE];
        numItems = 0;
    }
}
As mentioned above, if we keep the array sorted, then the insert operation is worst-case O(N) when there are N items in the queue: we can find the place for the new value efficiently using binary search, but then we'll need to move all the larger values over to the right to make room for the new value. As long as we keep the array sorted from low to high, the deleteMax is O(1) -- just decrement numItems and return the last value.
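As a sketch, insert and deleteMax for the low-to-high sorted version might look like the following; these methods assume the queue and numItems fields from the partial class definition above, and the exact code is illustrative rather than a required implementation:
// Worst-case O(N): binary search finds the insertion point quickly,
// but shifting the larger values to the right can touch every element.
public void insert(Comparable item) {
    if (numItems == queue.length) {
        queue = java.util.Arrays.copyOf(queue, queue.length * 2); // expand if full
    }
    // Binary search for the first index whose value is >= item.
    int lo = 0, hi = numItems;
    while (lo < hi) {
        int mid = (lo + hi) / 2;
        if (queue[mid].compareTo(item) < 0) lo = mid + 1;
        else hi = mid;
    }
    // Shift larger values one slot to the right -- the O(N) part.
    for (int i = numItems; i > lo; i--) {
        queue[i] = queue[i - 1];
    }
    queue[lo] = item;
    numItems++;
}

// O(1): with a low-to-high sort, the largest value is always at the end.
public Comparable deleteMax() {
    return queue[--numItems];
}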
<u>Using a BST </u>
If we use a BST, then the largest value can be found in time proportional to the height of the tree by going right until we reach a node with no right child. Removing that node is also O(tree height), as is inserting a new value. If the tree stays balanced, then these operations are no worse than O(log N); however, if the tree is not balanced, insert and deleteMax can be O(N). As for the linked-list implementation, there is no need for a numItems field.
In summary, BST operations take O(log N) time in the average case and O(N) in the worst case.
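Here is a minimal sketch of how findMax and deleteMax walk to the rightmost node in a plain (unbalanced) BST; the TreeNode class and the method names are illustrative, and both operations assume the queue is not empty:
public class BstPQ {
    private static class TreeNode {
        int value;
        TreeNode left, right;
        TreeNode(int value) { this.value = value; }
    }

    private TreeNode root;

    // O(tree height): keep going right until there is no right child.
    public int findMax() {
        TreeNode n = root;
        while (n.right != null) {
            n = n.right;
        }
        return n.value;
    }

    // O(tree height): remove the rightmost node, splicing in its left subtree.
    public int deleteMax() {
        TreeNode parent = null, n = root;
        while (n.right != null) {
            parent = n;
            n = n.right;
        }
        if (parent == null) {
            root = n.left;          // the root itself held the max
        } else {
            parent.right = n.left;
        }
        return n.value;
    }

    // O(tree height): standard iterative BST insert.
    public void insert(int value) {
        TreeNode newNode = new TreeNode(value);
        if (root == null) { root = newNode; return; }
        TreeNode n = root;
        while (true) {
            if (value < n.value) {
                if (n.left == null) { n.left = newNode; return; }
                n = n.left;
            } else {
                if (n.right == null) { n.right = newNode; return; }
                n = n.right;
            }
        }
    }
}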
<u>Using an AVL Tree </u>
Insertion/deletion/searching: O(log n) in all cases.
But complexity isn't everything; there are other considerations for actual performance.
For most purposes, most people don't even use an AVL tree as a balanced tree (Red-Black trees are more common as far as I've seen), let alone as a priority queue.
This is not to say that AVL trees are useless; I quite like them. But they do have a relatively expensive insert. What AVL trees are good for (beating even Red-Black trees) is doing lots and lots of lookups without modification, which is not what you need for a priority queue.
<u>Using a Heap </u>
When a priority queue is implemented using a heap, the worst-case times for both insert and deleteMax are logarithmic in the number of values in the priority queue.
For the insert operation, we start by adding a value to the end of the array (constant time, assuming the array doesn't have to be expanded); then we swap values up the tree until the order property has been restored. In the worst case, we follow a path all the way from a leaf to the root (i.e., the work we do is proportional to the height of the tree). Because a heap is a balanced binary tree, the height of the tree is O(log N), where N is the number of values stored in the tree.
The deleteMax operation is similar: in the worst case, we follow a path down the tree from the root to a leaf. Again, the worst-case time is O(log N).
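A minimal sketch of an array-based max-heap follows, assuming the usual 0-based layout where the children of index i are at 2*i+1 and 2*i+2; the class name and use of int values are illustrative, and deleteMax assumes the queue is not empty:
public class HeapPQ {
    private int[] heap = new int[10];
    private int numItems = 0;

    // O(log N): add at the end, then swap up until the order property holds.
    public void insert(int value) {
        if (numItems == heap.length) {
            heap = java.util.Arrays.copyOf(heap, heap.length * 2);
        }
        int i = numItems++;
        heap[i] = value;
        while (i > 0 && heap[i] > heap[(i - 1) / 2]) {   // child larger than parent?
            int parent = (i - 1) / 2;
            int tmp = heap[i]; heap[i] = heap[parent]; heap[parent] = tmp;
            i = parent;
        }
    }

    // O(log N): move the last value to the root, then swap it down a path to a leaf.
    public int deleteMax() {
        int max = heap[0];
        heap[0] = heap[--numItems];
        int i = 0;
        while (true) {
            int left = 2 * i + 1, right = 2 * i + 2, largest = i;
            if (left < numItems && heap[left] > heap[largest]) largest = left;
            if (right < numItems && heap[right] > heap[largest]) largest = right;
            if (largest == i) break;                     // order property restored
            int tmp = heap[i]; heap[i] = heap[largest]; heap[largest] = tmp;
            i = largest;
        }
        return max;
    }
}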