Sunday, August 06, 2006

Merge sort

In computer science, merge sort or mergesort is a sorting algorithm for rearranging lists (or any other data structure that can only be accessed sequentially, e.g. file streams) into a specified order. It is a particularly good example of the divide and conquer algorithmic paradigm. It is a comparison sort.

Algorithm

Conceptually, merge sort works as follows:
Divide the unsorted list into two sublists of about half the size.
Sort each of the two sublists recursively by re-applying merge sort.
Merge the two sorted sublists back into one sorted list.
The algorithm was invented by John von Neumann in 1945.
In a simple pseudocode form, the algorithm could look something like this:


function mergesort(m)
    var list left, right
    if length(m) ≤ 1
        return m
    else
        middle = length(m) / 2
        for each x in m up to middle
            add x to left
        for each x in m after middle
            add x to right
        left = mergesort(left)
        right = mergesort(right)
        result = merge(left, right)
        return result


There are several variants of the merge() function; the simplest could look like this:

function merge(left, right)
    var list result
    while length(left) > 0 and length(right) > 0
        if first(left) ≤ first(right)
            append first(left) to result
            left = rest(left)
        else
            append first(right) to result
            right = rest(right)
    if length(left) > 0
        append left to result
    if length(right) > 0
        append right to result
    return result
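
As a concrete counterpart to the pseudocode, here is a minimal runnable sketch in Python (the function names simply mirror the pseudocode above):

def mergesort(m):
    # Lists of zero or one element are already sorted.
    if len(m) <= 1:
        return m
    middle = len(m) // 2           # split point, as in the pseudocode
    left = mergesort(m[:middle])   # sort the first half
    right = mergesort(m[middle:])  # sort the second half
    return merge(left, right)

def merge(left, right):
    result = []
    i = j = 0
    # Repeatedly take the smaller front element; using <= takes from the
    # left sublist on ties, which keeps the sort stable.
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # One sublist is exhausted; append the remainder of the other.
    result.extend(left[i:])
    result.extend(right[j:])
    return result

print(mergesort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]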

Analysis


In sorting n items, merge sort has an average and worst-case performance of O(n log n). If the running time of merge sort for a list of length n is T(n), then the recurrence T(n) = 2T(n/2) + n follows from the definition of the algorithm (apply the algorithm to two lists of half the size of the original list, and add the n steps taken to merge the resulting two lists). The closed form follows from the master theorem.
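
One way to see the closed form without invoking the master theorem is to unroll the recurrence (assuming, for simplicity, that n is a power of 2):

T(n) = 2T(n/2) + n
     = 4T(n/4) + 2n
     = 8T(n/8) + 3n
     ...
     = 2^k · T(n/2^k) + k·n

Setting k = log n gives T(n) = n·T(1) + n log n, which is O(n log n).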
In the worst case, merge sort does exactly (n ⌈log n⌉ - 2^⌈log n⌉ + 1) comparisons, which is between (n log n - n + 1) and (n log n - 0.9139·n + 1) [logs are base 2]. Note that the worst-case count given here does not agree with that given in Knuth's Art of Computer Programming, Vol. 3; the discrepancy is due to Knuth analyzing a variant implementation of merge sort that is slightly suboptimal.
For large n and a randomly ordered input list, merge sort's expected (average) number of comparisons falls short of the worst case by about α·n, where

α = -1 + Σ_{k=0}^∞ 1/(2^k + 1) ≈ 0.2645

In the worst case, merge sort does about 39% fewer comparisons than quicksort does in its average case. Merge sort always makes fewer comparisons than quicksort, except in extremely rare cases when they tie; that tie occurs where merge sort's worst case coincides with quicksort's best case. In terms of moves, merge sort's worst-case complexity is O(n log n), the same as quicksort's best case, and merge sort's best case takes about half as many iterations as its worst case.
Recursive implementations of merge sort make 2n - 1 method calls in the worst case, compared to quicksort's n, and thus have roughly twice as much recursive overhead as quicksort. However, iterative, non-recursive implementations of merge sort, which avoid method call overhead, are not difficult to code (see the sketch below). Merge sort's most common implementation does not sort in place; therefore, memory equal to the size of the input must be allocated for the sorted output. Sorting in place is possible (see, for example, [1]) but is very complicated and offers little performance gain in practice, even if the algorithm runs in O(n log n) time. In these cases, algorithms like heapsort usually offer comparable speed and are far less complex.
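
For instance, a bottom-up, non-recursive variant can be sketched in Python as follows (this reuses the merge() from the sketch above; a width-doubling loop replaces the recursion):

def mergesort_bottom_up(m):
    # Work on a copy; merge runs of doubling width until one run remains.
    items = list(m)
    width = 1
    while width < len(items):
        result = []
        # Merge each adjacent pair of width-sized runs; slicing past the
        # end of the list simply yields a shorter (possibly empty) run.
        for i in range(0, len(items), 2 * width):
            result.extend(merge(items[i:i + width],
                                items[i + width:i + 2 * width]))
        items = result
        width *= 2
    return items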
Merge sort is much more efficient than quicksort if the data to be sorted can only be accessed efficiently in sequence, and it is thus popular in languages such as Lisp, where sequentially accessed data structures are very common. Unlike some (inefficient) implementations of quicksort, merge sort is a stable sort as long as the merge operation is implemented properly: in the merge() above, comparing with ≤ takes from the left sublist on ties, which preserves the original order of equal elements.
Merge sorting with tape drives
Merge sort is so inherently sequential that it's practical to run it using slow tape drives as input and output devices. It requires very little memory, and the memory required does not change with the number of data elements. If you have four tape drives, it works as follows:

divide the data to be sorted in half and put half on each of two tapes
merge individual pairs of records from the two tapes; write two-record chunks alternately to each of the two output tapes
merge the two-record chunks from the two output tapes into four-record chunks; write these alternately to the original two input tapes
merge the four-record chunks into eight-record chunks; write these alternately to the original two output tapes
repeat until you have one chunk containing all the data, sorted; this takes ⌈log n⌉ merge passes, where n is the number of records.

For the same reason, merge sort is also very useful for sorting data on disk that is too large to fit entirely into primary memory. On tape drives that can run both backwards and forwards, you can run merge passes in both directions, avoiding rewind time.
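
To make the four-tape procedure concrete, here is a small Python simulation in which plain lists stand in for the tapes (a sketch only; it reuses the merge() defined earlier, and a "tape" is simplified to a sequence of chunks read front to back):

def tape_merge_sort(data):
    # Pass 0: distribute one-record chunks alternately onto two tapes.
    tapes = [[], []]
    for i, record in enumerate(data):
        tapes[i % 2].append([record])
    # Each later pass merges chunk pairs from the two input tapes and
    # writes the doubled chunks alternately to two output tapes.
    while len(tapes[0]) + len(tapes[1]) > 1:
        out = [[], []]
        pairs = min(len(tapes[0]), len(tapes[1]))
        for i in range(pairs):
            out[i % 2].append(merge(tapes[0][i], tapes[1][i]))
        # An unpaired leftover chunk on the first tape (which always holds
        # at least as many chunks as the second) is carried through as is.
        if len(tapes[0]) > pairs:
            out[pairs % 2].append(tapes[0][pairs])
        tapes = out
    # The single remaining chunk (on the first tape) is the sorted data.
    return tapes[0][0] if tapes[0] else []

print(tape_merge_sort([5, 3, 8, 1, 9]))  # [1, 3, 5, 8, 9]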