
# Question on quicksort and mergesort calculations

I was looking at some old posts in comp.lang.c and found the following

I have some questions regarding the post.

First, and I quote

"Given an algorithm with loops or recursive calls, the way you find
a big-O equivalence class for that algorithm is to write down a
"recurrence relation" for the time taken to execute the code. For
instance, for merge sort, one gets (fixed width font required here;
\$x\$ denotes a variable x, etc.):
               {
       T(n) =  { \$a\$,              \$n = 1\$, \$a\$ a constant
               { \$2T(n/2) + cn\$,   \$n > 1\$, \$c\$ a constant
You then apply induction-type reasoning to show that, e.g., when
\$n\$ is a power of two, \$T(n) = an + cn \log_2 n\$. In big-O notation,
all the constant factors vanish, so this shows that mergesort is
O(n log n). "

First, how do you get this recurrence relation for mergesort? Second,
how did he get O(n log n)
And how, later on

"(Personally, I always found the worst part of dealing with recurrence
relations to be going from an open form to a closed one -- you just
have to have memorized all those formulae like "1 + 2 + ... + n =
n(n+1)/2", etc., and recognize them in the recurrences. Once you
can see this summation and recognize the closed form, it instantly
becomes obvious that, e.g., bubble sort is O(n*n).) "

I don't see how O(n*n) can be derived from 1 + 2 + ... + n = n(n+1)/2

"
Jun 27 '08 #1
On Apr 25, 8:27 am, Chad <cdal...@gmail.com> wrote:
I was looking at some old posts in comp.lang.c and found the following

I have some questions regarding the post.

First, and I quote

"Given an algorithm with loops or recursive calls, the way you find
a big-O equivalence class for that algorithm is to write down a
"recurrence relation" for the time taken to execute the code. For
instance, for merge sort, one gets (fixed width font required here;
\$x\$ denotes a variable x, etc.):

               {
       T(n) =  { \$a\$,              \$n = 1\$, \$a\$ a constant
               { \$2T(n/2) + cn\$,   \$n > 1\$, \$c\$ a constant

You then apply induction-type reasoning to show that, e.g., when
\$n\$ is a power of two, \$T(n) = an + cn \log_2 n\$. In big-O notation,
all the constant factors vanish, so this shows that mergesort is
O(n log n). "

First, how do you get this recurrence relation for mergesort?
A merge sort of two or more elements is performed by doing two merge
sorts of half the list (top half, then bottom half), then merging the
two sorted halves. Merging can be done by examining the head element
of each of the two lists and doing a single comparison and move. There
are at most (n-1) compares, and n moves. So if we are counting
comparisons or moves we get something like:

X(n) <= 2 * X(n/2) + c1 * n

(Two merge sorts of half the list, then a merge of the two half-sized
lists into the whole starting list.) For one element, X(1) = c2. So
if we take an upper bound T(n) >= X(n) on the number of operations,
then we get:

T(n) = 2 * T(n/2) + c1 * n, T(1) = c2.
[...] Second, how did he get O(n log n) [...]
Let us suppose that n = 2**(i+1), or i = lg_2(n) - 1 (= lg_2(n/2)).
We can form the following telescoping sum:

T(n)        -  2 * T(n/2)     =      c1 * n            = c1 * n
2*T(n/2)    -  4 * T(n/4)     =  2 * c1 * (n/2)        = c1 * n
4*T(n/4)    -  8 * T(n/8)     =  4 * c1 * (n/4)        = c1 * n
8*T(n/8)    - 16 * T(n/16)    =  8 * c1 * (n/8)        = c1 * n
...
(2**i)*T(2) - (2**(i+1))*T(1) = (2**i) * c1 * (n/2**i) = c1 * n

Then summing vertically we get:

T(n) - (2**(i+1))*T(1) = (i+1) * c1 * n

or:

T(n) = n * c2 + (lg_2(n)) * c1 * n

We have only derived this for n as a power of two, but we can back
substitute to verify that it is correct in general:

(T(n) - c1 * n) / 2
= (n * c2 + (lg_2(n)) * c1 * n - c1 * n) / 2
= (n * c2 + (lg_2(n)-1) * c1 * n) / 2
= (n/2) * c2 + (lg_2(n) - 1) * c1 * (n / 2)
= (n/2) * c2 + (lg_2(n/2)) * c1 * (n / 2)
= T(n/2)

So we are set. In any event the dominant term with respect to n is
(lg_2(n)) * c1 * n so we can conclude that T(n) is O(n*ln(n)).
And how, later on

"(Personally, I always found the worst part of dealing with
recurrence relations to be going from an open form to a closed one
-- you just have to have memorized all those formulae like "1 + 2
+ ... + n = n(n+1)/2", etc., and recognize them in the recurrences.
Once you can see this summation and recognize the closed form, it
instantly becomes obvious that, e.g., bubble sort is O(n*n).) "
Whatever; this is just basic mathematical skill.
I don't see how O(n*n) can be derived from 1 + 2 + ... + n = n(n+1)/2
n(n+1)/2 = n**2/2 + n/2 which is dominated by n**2/2. More rigorously:

lim{n->inf} ( (n*(n+1)/2) / (n**2/2) ) = 1

which just brings us back to the basic definition of O(f(n)).

--
Paul Hsieh
http://www.pobox.com/~qed/
http://bstring.sf.net/
Jun 27 '08 #2
On Apr 25, 8:27 am, Chad <cdal...@gmail.com> wrote:
I was looking at some old posts in comp.lang.c and found the following

I have some questions regarding the post.

First, and I quote

"Given an algorithm with loops or recursive calls, the way you find
a big-O equivalence class for that algorithm is to write down a
"recurrence relation" for the time taken to execute the code. For
instance, for merge sort, one gets (fixed width font required here;
\$x\$ denotes a variable x, etc.):

               {
       T(n) =  { \$a\$,              \$n = 1\$, \$a\$ a constant
               { \$2T(n/2) + cn\$,   \$n > 1\$, \$c\$ a constant

You then apply induction-type reasoning to show that, e.g., when
\$n\$ is a power of two, \$T(n) = an + cn \log_2 n\$. In big-O notation,
all the constant factors vanish, so this shows that mergesort is
O(n log n). "

First, how do you get this recurrence relation for mergesort? Second,
how did he get O(n log n)

And how, later on

"(Personally, I always found the worst part of dealing with recurrence
relations to be going from an open form to a closed one -- you just
have to have memorized all those formulae like "1 + 2 + ... + n =
n(n+1)/2", etc., and recognize them in the recurrences. Once you
can see this summation and recognize the closed form, it instantly
becomes obvious that, e.g., bubble sort is O(n*n).) "

I don't see how O(n*n) can be derived from 1 + 2 + ... + n = n(n+1)/2

"
Actually, now that I think about it, I have no idea how one gets the
following summation

1 + 2 + 3 + ... + n

for bubble sort.
Jun 27 '08 #3

<snip>
I don't see how O(n*n) can be derived from 1 + 2 + ... + n = n(n+1)/2
n(n+1)/2 is (n*n + n)/2. To get O(n*n) from this, bear in mind that n grows
insignificantly compared to n*n as n increases, and big-O is a sort of
big-picture measure that loses nitty-gritty detail such as n in the face
of something big and obvious like n*n. So we're down to (n*n)/2. To get
from there to n*n, simply buy a computer that runs at half the speed. (In
other words, constant factors aren't terribly interesting when compared to
the rate of growth of n - they don't change the /shape/ of the algorithmic
complexity.)
Actually, now that I think about it, I have no idea how one gets the
following summation

1 + 2 + 3 + ... + n

for bubble sort.
It's actually 1 + 2 + 3 + ... + (n-1)

Consider { 6, 5, 4, 3, 2, 1 }

To sort this six-element array using bubble sort, you bubble the biggest
element to the right (five comparisons and swaps), and then you have an
unsorted five-element array and one sorted item. To sort the five-element
array using bubble sort, you bubble the biggest element to the right (four
comparisons and swaps), and then you have an unsorted four-element array
and two sorted items. To sort the four-element array using bubble sort,
you bubble the biggest element to the right (three comparisons and swaps),
and then you have an unsorted three-element array and three sorted items.

And so on. 5 + 4 + 3 + 2 + 1. It sums to n(n-1)/2, which is O(n*n).

--
Richard Heathfield <http://www.cpax.org.uk >
Email: -http://www. +rjh@
"Usenet is a strange place" - dmr 29 July 1999
Jun 27 '08 #4
On Apr 25, 7:16 pm, Richard Heathfield <r...@see.sig.invalid> wrote:

<snip>
I don't see how O(n*n) can be derived from 1 + 2 + ... + n = n(n+1)/2

n(n+1)/2 is (n*n + n)/2. To get O(n*n) from this, bear in mind that n grows
insignificantly compared to n*n as n increases, and big-O is a sort of
big-picture measure that loses nitty-gritty detail such as n in the face
of something big and obvious like n*n. So we're down to (n*n)/2. To get
from there to n*n, simply buy a computer that runs at half the speed. (In
other words, constant factors aren't terribly interesting when compared to
the rate of growth of n - they don't change the /shape/ of the algorithmic
complexity.)
Actually, now that I think about it, I have no idea how one gets the
following summation
1 + 2 + 3 + ... + n
for bubble sort.

It's actually 1 + 2 + 3 + ... + (n-1)

Consider { 6, 5, 4, 3, 2, 1 }

To sort this six-element array using bubble sort, you bubble the biggest
element to the right (five comparisons and swaps), and then you have an
unsorted five-element array and one sorted item. To sort the five-element
array using bubble sort, you bubble the biggest element to the right (four
comparisons and swaps), and then you have an unsorted four-element array
and two sorted items. To sort the four-element array using bubble sort,
you bubble the biggest element to the right (three comparisons and swaps),
and then you have an unsorted three-element array and three sorted items.

And so on. 5 + 4 + 3 + 2 + 1. It sums to n(n-1)/2, which is O(n*n).
Thank you for that clarification. Now, one last question. When Paul
sums up

T(n)     -  2 * T(n/2)  =      c1 * n      = c1 * n
2*T(n/2) -  4 * T(n/4)  =  2 * c1 * (n/2)  = c1 * n
4*T(n/4) -  8 * T(n/8)  =  4 * c1 * (n/4)  = c1 * n
8*T(n/8) - 16 * T(n/16) =  8 * c1 * (n/8)  = c1 * n
...

He gets

T(n) - (2**(i+1))*T(1) = (i+1) * c1 * n
I don't see where he gets T(1) from. Maybe to get some insight into my
confusion, he has
T(n) = 2 * T(n/2) + c1 * n, T(1) = c2

I also don't see why he would introduce a second constant called c2
for T(1).
Jun 27 '08 #5
On Apr 25, 7:27 pm, Chad <cdal...@gmail.com> wrote:
On Apr 25, 7:16 pm, Richard Heathfield <r...@see.sig.invalid> wrote:

<snip>
I don't see how O(n*n) can be derived from 1 + 2 + ... + n = n(n+1)/2
n(n+1)/2 is (n*n + n)/2. To get O(n*n) from this, bear in mind that n grows
insignificantly compared to n*n as n increases, and big-O is a sort of
big-picture measure that loses nitty-gritty detail such as n in the face
of something big and obvious like n*n. So we're down to (n*n)/2. To get
from there to n*n, simply buy a computer that runs at half the speed. (In
other words, constant factors aren't terribly interesting when compared to
the rate of growth of n - they don't change the /shape/ of the algorithmic
complexity.)
Actually, now that I think about it, I have no idea how one gets the
following summation
1 + 2 + 3 + ... + n
for bubble sort.
It's actually 1 + 2 + 3 + ... + (n-1)
Consider { 6, 5, 4, 3, 2, 1 }
To sort this six-element array using bubble sort, you bubble the biggest
element to the right (five comparisons and swaps), and then you have an
unsorted five-element array and one sorted item. To sort the five-element
array using bubble sort, you bubble the biggest element to the right (four
comparisons and swaps), and then you have an unsorted four-element array
and two sorted items. To sort the four-element array using bubble sort,
you bubble the biggest element to the right (three comparisons and swaps),
and then you have an unsorted three-element array and three sorted items.
And so on. 5 + 4 + 3 + 2 + 1. It sums to n(n-1)/2, which is O(n*n).

Thank you for that clarification. Now, one last question. When Paul
sums up

T(n)     -  2 * T(n/2)  =      c1 * n      = c1 * n
2*T(n/2) -  4 * T(n/4)  =  2 * c1 * (n/2)  = c1 * n
4*T(n/4) -  8 * T(n/8)  =  4 * c1 * (n/4)  = c1 * n
8*T(n/8) - 16 * T(n/16) =  8 * c1 * (n/8)  = c1 * n
...

He gets

T(n) - (2**(i+1))*T(1) = (i+1) * c1 * n

I don't see where he gets T(1) from. Maybe to get some insight into my
confusion, he has
T(n) = 2 * T(n/2) + c1 * n, T(1) = c2

I also don't see why he would introduce a second constant called c2
for T(1).
Okay, I just sat and thought about how the merge sort derivation Paul
did and now I see how his calculation works.
Jun 27 '08 #6
