Try searching Google or Wikipedia or (gasp!) Yahoo for 'priority queue', as well as 'heap data structure'.
The simple, brute-force approach costs O(m) for each insertion into your result list, since you scan the heads of all m lists for the minimum value to push onto your result array.
The more efficient approach stores each list-head pointer in a min-heap, so extracting the minimum value and restoring the heap property costs O(log m).
Now, I think you need to do this extraction m*n times, not just n times (if n is the length of each list rather than the total element count), so the complexity of the more efficient approach is O(nm log m), while the less-efficient one is O(nm^2).
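Here is a minimal sketch of that heap-based merge, assuming Python and its built-in heapq module (the function name merge_sorted_lists and the sample lists are just illustrative):

```python
import heapq

def merge_sorted_lists(lists):
    """Merge m sorted lists using a min-heap of list heads.

    Each pop/push on the heap costs O(log m), and every element
    passes through the heap exactly once, so the total cost is
    O(N log m), where N is the total number of elements.
    """
    heap = []
    # Seed the heap with the head of each non-empty list.
    # Tuples are (value, list_index, element_index); the index fields
    # break ties so equal values never cause comparison problems.
    for i, lst in enumerate(lists):
        if lst:
            heapq.heappush(heap, (lst[0], i, 0))

    result = []
    while heap:
        value, i, j = heapq.heappop(heap)
        result.append(value)
        # Advance within the list the minimum came from.
        if j + 1 < len(lists[i]):
            heapq.heappush(heap, (lists[i][j + 1], i, j + 1))
    return result

# Example: three sorted lists merged into one.
print(merge_sorted_lists([[1, 4, 7], [2, 5, 8], [3, 6, 9]]))
# [1, 2, 3, 4, 5, 6, 7, 8, 9]
```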
answer #1 · answered by ouksta 2 · 2006-08-24 21:43:29
"What is *THE* algorithm" implies that there is exactly one with O(n log m) complexity. There's probably more than one algorithm so the correct question is "What is *an* algorithm..." Having said that, the algorithm you're searching for is probably mentioned in your class notes or textbook.
answer #2 · answered by Valdis K 6 · 2006-08-25 04:52:54
It is called merge sort, for the simple fact that you are merging sorted lists in its recursion step; a sketch of that view follows below.
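If you treat the m lists as the leaves of merge sort's recursion tree and merge them pairwise, you also get O(n log m) total work. A rough sketch of that idea (Python again; the names merge_two and merge_k are just illustrative):

```python
def merge_two(a, b):
    """Standard merge-sort merge of two sorted lists in O(len(a) + len(b))."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i])
            i += 1
        else:
            out.append(b[j])
            j += 1
    out.extend(a[i:])
    out.extend(b[j:])
    return out

def merge_k(lists):
    """Merge m sorted lists by merging them pairwise, merge-sort style.

    The recursion tree has depth O(log m), and each level touches
    every element once, so the total work is O(N log m).
    """
    if not lists:
        return []
    if len(lists) == 1:
        return lists[0]
    mid = len(lists) // 2
    return merge_two(merge_k(lists[:mid]), merge_k(lists[mid:]))

print(merge_k([[1, 4, 7], [2, 5, 8], [3, 6, 9]]))
# [1, 2, 3, 4, 5, 6, 7, 8, 9]
```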
answer #3 · answered by Andy T 7 · 2006-08-25 03:21:17