
algorithm - amortized analysis on min-heap?

If we perform n arbitrary insert and delete operations on an initially empty min-heap (with the location of each deleted element given), why is the amortized cost O(1) for insert and O(log n) for delete?

a) insert O(log n), delete O(1)

b) insert O(log n), delete O(log n) 

c) insert O(1), delete O(1)

d) insert O(1), delete O(log n)

Could anyone clarify this for me?


1 Answer


Based on your question and responses to comments, I'm going to assume a binary heap.

First, the worst case for insertion is O(log n) and the worst case for removal of the smallest item is O(log n). This follows from the tree structure of the heap. That is, for a heap of n items, there are log(n) levels in the tree.

Insertion involves (logically) adding the item in the next free position on the lowest level of the tree and then "bubbling" it up to the required level. If the new item is smaller than the root, then it has to bubble all the way to the top--all log(n) levels. So if you insert the numbers 10, 9, 8, 7, 6, 5, 4, 3, 2, 1 into a min-heap, you'll hit the worst case for every insertion.
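For concreteness, here is a minimal sift-up sketch over a plain Python list treated as a 0-indexed array (the parent of index i is at (i - 1) // 2). The function name and the returned swap count are my own choices for illustration, not from any particular library:

```python
def heap_insert(heap, item):
    """Append item at the bottom of the heap, bubble it up, return the swap count."""
    heap.append(item)
    i, swaps = len(heap) - 1, 0
    while i > 0:
        parent = (i - 1) // 2
        if heap[parent] <= heap[i]:
            break                                   # parent is no larger: heap property holds
        heap[i], heap[parent] = heap[parent], heap[i]
        i, swaps = parent, swaps + 1
    return swaps
```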

Removal of the smallest item involves replacing the root (the smallest item) with the last item in the heap and then "sifting" that item down to its proper position. Again, this can take up to log(n) operations.
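And the matching sift-down sketch, under the same hypothetical list representation (children of index i at 2*i + 1 and 2*i + 2); it returns both the removed minimum and how far the relocated last item sank:

```python
def heap_extract_min(heap):
    """Remove the root, move the last item to the top, sift it down; return (min, swaps)."""
    last = heap.pop()
    if not heap:
        return last, 0                              # the popped item was the only element
    smallest, heap[0] = heap[0], last
    i, swaps, n = 0, 0, len(heap)
    while True:
        left, right = 2 * i + 1, 2 * i + 2
        child = i
        if left < n and heap[left] < heap[child]:
            child = left
        if right < n and heap[right] < heap[child]:
            child = right
        if child == i:
            break                                   # neither child is smaller: done sinking
        heap[i], heap[child] = heap[child], heap[i]
        i, swaps = child, swaps + 1
    return smallest, swaps
```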

That's the worst case. The average case is much different.

Remember that in a binary heap, half of the nodes are leaves--they have no children. So if you're inserting items in random order, half the time the item you're inserting will belong on the lowest level and there is no "bubble up" to do. So half the time your insert operation is O(1). Of the other half, half of those items will belong on the second level up. And so on. The only time you actually do log(n) operations on insert is when the item you're inserting is smaller than the existing root item. It's quite possible, then, that the observed runtime behavior is that insertion is O(1). In fact, that will be the behavior if you insert a sorted array into a min-heap--that is, if you insert the values 1, 2, 3, 4, 5, 6, 7, 8, 9, 10 in that order.
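To put rough numbers on that "half the time" argument (assuming the inserted key is about equally likely to come to rest at any node position, which is roughly what happens with random input): about half of the inserts do 0 bubble-up swaps, a quarter do 1, an eighth do 2, and so on, so the expected number of swaps is about

0·(1/2) + 1·(1/4) + 2·(1/8) + 3·(1/16) + ... = 1,

a constant, which is where the O(1) average insert comes from.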

When removing the smallest item from a min-heap, you take the last item from the heap and sift it down from the top. The "half the time" rule comes into play again, but this time it's working against you. That last item you took from the heap probably belongs down there on the lowest level, so you have to sift it all the way back down, which takes log(n) operations. Half the time you'll have to do all log(n) of those operations; for half of the remaining removals you'll need to do all but one of them, and so on. The minimum number of levels you have to sift down depends on the depth of the tree: for example, if your heap has more than three items, then removing the smallest item will require at least one sift-down operation, because the next-smallest item is always on the second level of the tree.
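The same style of estimate now works against you (again assuming random keys): roughly half of the removals sift the relocated item all the way down (log(n) levels), a quarter stop one level short, an eighth two levels short, and so on, so the expected sift-down distance is about

log(n)·(1/2) + (log(n) - 1)·(1/4) + (log(n) - 2)·(1/8) + ... ≈ log(n) - 1,

which is still Θ(log n).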

It turns out, then, that in the average case insertion into a binary heap takes much less than O(log n) time--it's likely closer to O(1)--while removal from a binary heap stays much closer to the worst case of O(log n). That matches option (d) in your question: insert O(1), delete O(log n).
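If you want to check this empirically, here is a small experiment using the two sketches above; the value of n, the random seed, and the use of random floats as keys are arbitrary choices of mine:

```python
import random

def measure(n=100_000, seed=0):
    """Insert n random keys, then drain the heap, reporting average swaps per operation."""
    random.seed(seed)
    heap, insert_swaps, extract_swaps = [], 0, 0
    for _ in range(n):
        insert_swaps += heap_insert(heap, random.random())
    while heap:
        _, swaps = heap_extract_min(heap)
        extract_swaps += swaps
    print(f"average swaps per insert:  {insert_swaps / n:.2f}")   # stays near a small constant
    print(f"average swaps per extract: {extract_swaps / n:.2f}")  # grows roughly like log2(n)

measure()
```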

