MergeSort belongs to the group of recursive divide-and-conquer algorithms. The input list of data records is divided into two sublists, and the algorithm calls itself for both sublists. When on some recursion level a list contains only one element, the subroutine returns it as a sorted list (in this case one element long). On each return level two sorted sublists are merged and one sorted list is returned. Merging is done in a simple way: one pointer marks the beginning of the first list, another one the beginning of the second list. The key values at the pointers are compared, the record with the smaller key goes into the resulting list, and that pointer is advanced. As soon as one of the sublists is empty, all remaining elements of the other list can be copied to the result without any additional compare operations.
With a little trick MergeSort can be made stable. First, when splitting, all records nearer to the beginning of a list go into the first sublist. Then, when records with identical key values are found during the merge operation, the record from the first sublist goes into the sorted result. In this way records with identical key values keep the same relative order as in the original list.
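The split and the stable merge described above can be sketched as follows (a minimal Python sketch; the function names and the `key` parameter are illustrative, not part of the original text):

```python
def merge_sort(records, key=lambda r: r):
    """Sort records stably; key extracts the comparison value."""
    if len(records) <= 1:
        return list(records)                  # one element: already sorted
    mid = len(records) // 2
    left = merge_sort(records[:mid], key)     # records nearer the beginning
    right = merge_sort(records[mid:], key)
    return merge(left, right, key)

def merge(left, right, key):
    """Merge two sorted sublists into one sorted list."""
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        # "<=" lets equal keys from the first sublist go first -> stable
        if key(left[i]) <= key(right[j]):
            result.append(left[i]); i += 1
        else:
            result.append(right[j]); j += 1
    # one sublist is empty: copy the rest without further compares
    result.extend(left[i:])
    result.extend(right[j:])
    return result
```

For example, sorting `[(1, 'a'), (0, 'b'), (1, 'c')]` by the first tuple element keeps `(1, 'a')` in front of `(1, 'c')`.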
The time necessary to divide a list into two sublists can be neglected, so we need to estimate how many compare and move operations are necessary.
Since the list is divided into two (more or less) equal sublists, the number of recursion levels is given by
- L = lg2(n)
On each recursion level the keys are compared with each other during the merge. When two sublists with ten records each are merged and their keys are evenly interleaved, 19 compares are necessary to build the resulting list. To keep the maths simple we assume that 20 compares have to be done. As a result we get n compare operations and n moves on each recursion level, and the average time necessary for a sort can be estimated with
- T = F * n * lg2(n)
where F is a constant reflecting the implementation. If MergeSort is used without any additions, this estimate is correct for the worst case as well. There is a best case when the input list is already sorted. Then, while merging, one sublist is exhausted and the other is copied completely to the result in one piece. Only about half of the compare operations are necessary, but still all move operations. Since a move operation normally takes much more time than a compare operation, the difference will not be of importance.
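The estimate can be checked empirically by counting the compare operations (a sketch; the name `merge_sort_counted` and the test sizes are made up for illustration):

```python
import random

def merge_sort_counted(a):
    """Return (sorted copy of a, number of key comparisons used)."""
    if len(a) <= 1:
        return list(a), 0
    mid = len(a) // 2
    left, cl = merge_sort_counted(a[:mid])
    right, cr = merge_sort_counted(a[mid:])
    out, i, j, c = [], 0, 0, 0
    while i < len(left) and j < len(right):
        c += 1                        # one compare per loop iteration
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])              # leftover tail: no compares needed
    out.extend(right[j:])
    return out, cl + cr + c

random.seed(1)
n = 1024                              # lg2(1024) = 10 recursion levels
data = [random.random() for _ in range(n)]
result, c_random = merge_sort_counted(data)
_, c_sorted = merge_sort_counted(sorted(data))
# c_random stays below n * lg2(n) = 10240; c_sorted is about half of that
```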
There is a nice trick to speed up sorting when the input list is nearly sorted. While generating the sublists, the key data are checked for so-called runs, sequences of already sorted records. These sequences are not subdivided any further. As a result the best case timing when checking for runs is
- T = G * n
where G is a constant reflecting the implementation.
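Checking for runs turns plain MergeSort into what is commonly called natural merge sort. A sketch under that assumption (function names are illustrative):

```python
def merge(left, right):
    """Stable two-way merge of two sorted lists."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:       # "<=" keeps the merge stable
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:]); out.extend(right[j:])
    return out

def find_runs(a):
    """Split a into maximal runs of already sorted records."""
    runs, start = [], 0
    for i in range(1, len(a)):
        if a[i] < a[i - 1]:           # run ends where the order breaks
            runs.append(a[start:i])
            start = i
    runs.append(a[start:])
    return runs

def natural_merge_sort(a):
    """Merge neighbouring runs until only one run is left."""
    runs = find_runs(a)
    while len(runs) > 1:
        merged = [merge(runs[k], runs[k + 1])
                  for k in range(0, len(runs) - 1, 2)]
        if len(runs) % 2:             # odd run out: carry it to the next pass
            merged.append(runs[-1])
        runs = merged
    return runs[0]
```

On an already sorted input `find_runs` reports a single run, so the sort finishes after one linear scan, matching the G * n best case.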
We start with this input list
and split it into two lists.
Now we split list 2 into two lists.
List 4 is still too long, so we split it as well.
The two lists cannot be split again, so we merge them to form the sorted list 4.
Now lists 4 and 5 can be merged in order to form the sorted list 2 again.
We split list 3.
These lists cannot be split again, so we merge them to form the sorted list 3.
Now lists 2 and 3 can be recombined to form the sorted list 1.
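The split and merge sequence above can be reproduced with a small trace (the input values are made up for illustration, since the original list contents are not shown here):

```python
def merge_sort_trace(a, depth=0):
    """Print each split and merge step while sorting a."""
    print("  " * depth + "split " + str(a))
    if len(a) <= 1:
        return list(a)
    mid = len(a) // 2
    left = merge_sort_trace(a[:mid], depth + 1)
    right = merge_sort_trace(a[mid:], depth + 1)
    # stable merge of the two sorted halves
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:]); out.extend(right[j:])
    print("  " * depth + "merge " + str(out))
    return out

merge_sort_trace([6, 3, 8, 1, 5])
```

The indentation of the output mirrors the recursion levels: each list is split until single elements remain, then the sorted sublists are merged back together level by level.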