Bytes | Software Development & Data Engineering Community

Question on quicksort and mergesort calculations

I was looking at some old posts in comp.lang.c and found the following

http://groups.google.com/group/comp....b5046326994e18

I have some questions regarding the post.

First, and I quote

"Given an algorithm with loops or recursive calls, the way you find
a big-O equivalence class for that algorithm is to write down a
"recurrence relation" for the time taken to execute the code. For
instance, for merge sort, one gets (fixed width font required here;
$x$ denotes a variable x, etc.):
       {
T(n) = { $a$,              $n = 1$, $a$ a constant
       { $2T(n/2) + cn$,   $n > 1$, $c$ a constant
You then apply induction-type reasoning to show that, e.g., when
$n$ is a power of two, $T(n) = an + cn \log_2 n$. In big-O notation,
all the constant factors vanish, so this shows that mergesort is
O(n log n). "

First, how do you get this recurrence relation for mergesort? Second,
how did he get O(n log n)?
And how, later on

"(Personally, I always found the worst part of dealing with
recurrence
relations to be going from an open form to a closed one -- you just
have to have memorized all those formulae like "1 + 2 + ... + n =
n(n+1)/2", etc., and recognize them in the recurrences. Once you
can see this summation and recognize the closed form, it instantly
becomes obvious that, e.g., bubble sort is O(n*n).) "

I don't see how O(n*n) can be derived from 1 + 2 + ... + n = n(n+1)/2

"
Jun 27 '08 #1
On Apr 25, 8:27 am, Chad <cdal...@gmail.com> wrote:
I was looking at some old posts in comp.lang.c and found the following

http://groups.google.com/group/comp....d/thread/d26ab...

I have some questions regarding the post.

First, and I quote

"Given an algorithm with loops or recursive calls, the way you find
a big-O equivalence class for that algorithm is to write down a
"recurrence relation" for the time taken to execute the code. For
instance, for merge sort, one gets (fixed width font required here;
$x$ denotes a variable x, etc.):

       {
T(n) = { $a$,              $n = 1$, $a$ a constant
       { $2T(n/2) + cn$,   $n > 1$, $c$ a constant

You then apply induction-type reasoning to show that, e.g., when
$n$ is a power of two, $T(n) = an + cn \log_2 n$. In big-O notation,
all the constant factors vanish, so this shows that mergesort is
O(n log n). "

First, how do you get this recurrence relation for mergesort?
A merge sort of two or more elements is performed by doing two merge
sorts of half the list (top, then bottom) then merging the list.
Merging can be done by repeatedly comparing the head element of each
list and moving the smaller one to the output. There are at most (n-1)
compares, and n moves. So if we are counting comparisons or moves we
get something like:

X(n) <= 2 * X(n/2) + c1 * n

(2 merges of half the list, then a merge of the two half sized lists
to an output of the whole starting list.) For one element, X(1) =
c2. So if we take T(n) to be the least upper bound on the number of
operations, T(n) >= X(n), then we get:

T(n) = 2 * T(n/2) + c1 * n, T(1) = c2.
[...] Second, how did he get O(n log n) [...]
Let us suppose that n = 2**(i+1), or i = lg_2(n)-1 ( = lg_2(n/2)) .
We can form the following telescoping sum:

T( n) - 2 * T(n/2 ) = c1 * (n ) = c1 * n
2*T(n/2) - 4 * T(n/4 ) = 2 * c1 * (n / 2) = c1 * n
4*T(n/4) - 8 * T(n/8 ) = 4 * c1 * (n / 4) = c1 * n
8*T(n/8) - 16 * T(n/16) = 8 * c1 * (n / 8) = c1 * n
...
(2**i)*T(2) - (2**(i+1))*T(1) = c1 * n

Then summing vertically we get:

T(n) - (2**(i+1))*T(1) = (i+1) * c1 * n

or, since 2**(i+1) = n, i+1 = lg_2(n), and T(1) = c2:

T(n) = n * c2 + (lg_2(n)) * c1 * n

We have only derived this for n a power of two, but we can substitute
the closed form back into the recurrence to verify that it is
consistent:

(T(n) - c1 * n) / 2
= (n * c2 + (lg_2(n)) * c1 * n - c1 * n) / 2
= (n * c2 + (lg_2(n)-1) * c1 * n) / 2
= (n/2) * c2 + (lg_2(n) - 1) * c1 * (n / 2)
= (n/2) * c2 + (lg_2(n/2)) * c1 * (n / 2)
= T(n/2)

So we are set. In any event the dominant term with respect to n is
(lg_2(n)) * c1 * n so we can conclude that T(n) is O(n*ln(n)).
And how, later on

"(Personally, I always found the worst part of dealing with
recurrence relations to be going from an open form to a closed one
-- you just have to have memorized all those formulae like "1 + 2
+ ... + n = n(n+1)/2", etc., and recognize them in the recurrences.
Once you can see this summation and recognize the closed form, it
instantly becomes obvious that, e.g., bubble sort is O(n*n).) "
Whatever; this is just basic mathematical skill.
I don't see how O(n*n) can be derived from 1 + 2 + ... + n = n(n+1)/2
n(n+1)/2 = n**2/2 + n/2 which is dominated by n**2/2. More rigorously:

lim{n->inf} ( (n*(n+1)/2) / (n**2/2) ) = 1

which just brings us back to the basic definition of O(f(n)).

--
Paul Hsieh
http://www.pobox.com/~qed/
http://bstring.sf.net/
Jun 27 '08 #2
On Apr 25, 8:27 am, Chad <cdal...@gmail.com> wrote:
I was looking at some old posts in comp.lang.c and found the following

http://groups.google.com/group/comp....d/thread/d26ab...

I have some questions regarding the post.

First, and I quote

"Given an algorithm with loops or recursive calls, the way you find
a big-O equivalence class for that algorithm is to write down a
"recurrence relation" for the time taken to execute the code. For
instance, for merge sort, one gets (fixed width font required here;
$x$ denotes a variable x, etc.):

       {
T(n) = { $a$,              $n = 1$, $a$ a constant
       { $2T(n/2) + cn$,   $n > 1$, $c$ a constant

You then apply induction-type reasoning to show that, e.g., when
$n$ is a power of two, $T(n) = an + cn \log_2 n$. In big-O notation,
all the constant factors vanish, so this shows that mergesort is
O(n log n). "

First, how do you get this recurrence relation for mergesort? Second,
how did he get O(n log n)?

And how, later on

"(Personally, I always found the worst part of dealing with
recurrence
relations to be going from an open form to a closed one -- you just
have to have memorized all those formulae like "1 + 2 + ... + n =
n(n+1)/2", etc., and recognize them in the recurrences. Once you
can see this summation and recognize the closed form, it instantly
becomes obvious that, e.g., bubble sort is O(n*n).) "

I don't see how O(n*n) can be derived from 1 + 2 + ... + n = n(n+1)/2

"
Actually, now that I think about it, I have no idea how one gets the
following summation

1 + 2 + 3 + ... + n

for bubble sort.
Jun 27 '08 #3
Chad said:

<snip>
I don't see how O(n*n) can be derived from 1 + 2 + ... + n = n(n+1)/2
n(n+1)/2 is (n*n + n)/2. To get O(n*n) from this, bear in mind that n grows
insignificantly compared to n*n as n increases, and big-O is a sort of
big-picture measure that loses nitty-gritty detail such as n in the face
of something big and obvious like n*n. So we're down to (n*n)/2. To get
from there to n*n, simply buy a computer that runs at half the speed. (In
other words, constant factors aren't terribly interesting when compared to
the rate of growth of n - they don't change the /shape/ of the algorithmic
complexity.)
Actually, now that I think about it, I have no idea how one gets the
following summation

1 + 2 + 3 + ... + n

for bubble sort.
It's actually 1 + 2 + 3 + ... + (n-1)

Consider { 6, 5, 4, 3, 2, 1 }

To sort this six-element array using bubble sort, you bubble the biggest
element to the right (five comparisons and swaps), and then you have an
unsorted five-element array and one sorted item. To sort the five-element
array using bubble sort, you bubble the biggest element to the right (four
comparisons and swaps), and then you have an unsorted four-element array
and two sorted items. To sort the four-element array using bubble sort,
you bubble the biggest element to the right (three comparisons and swaps),
and then you have an unsorted three-element array and three sorted items.

And so on. 5 + 4 + 3 + 2 + 1. It sums to n(n-1)/2, which is O(n*n).

--
Richard Heathfield <http://www.cpax.org.uk>
Email: -http://www. +rjh@
Google users: <http://www.cpax.org.uk/prg/writings/googly.php>
"Usenet is a strange place" - dmr 29 July 1999
Jun 27 '08 #4
On Apr 25, 7:16 pm, Richard Heathfield <r...@see.sig.invalid> wrote:
Chad said:

<snip>
I don't see how O(n*n) can be derived from 1 + 2 + ... + n = n(n+1)/2

n(n+1)/2 is (n*n + n)/2. To get O(n*n) from this, bear in mind that n grows
insignificantly compared to n*n as n increases, and big-O is a sort of
big-picture measure that loses nitty-gritty detail such as n in the face
of something big and obvious like n*n. So we're down to (n*n)/2. To get
from there to n*n, simply buy a computer that runs at half the speed. (In
other words, constant factors aren't terribly interesting when compared to
the rate of growth of n - they don't change the /shape/ of the algorithmic
complexity.)
Actually, now that I think about it, I have no idea how one gets the
following summation
1 + 2 + 3 + ... + n
for bubble sort.

It's actually 1 + 2 + 3 + ... + (n-1)

Consider { 6, 5, 4, 3, 2, 1 }

To sort this six-element array using bubble sort, you bubble the biggest
element to the right (five comparisons and swaps), and then you have an
unsorted five-element array and one sorted item. To sort the five-element
array using bubble sort, you bubble the biggest element to the right (four
comparisons and swaps), and then you have an unsorted four-element array
and two sorted items. To sort the four-element array using bubble sort,
you bubble the biggest element to the right (three comparisons and swaps),
and then you have an unsorted three-element array and three sorted items.

And so on. 5 + 4 + 3 + 2 + 1. It sums to n(n-1)/2, which is O(n*n).
Thank you for that clarification. Now, one last question. When Paul
sums up

T( n) - 2 * T(n/2 ) = c1 * (n ) = c1 * n
2*T(n/2) - 4 * T(n/4 ) = 2 * c1 * (n / 2) = c1 * n
4*T(n/4) - 8 * T(n/8 ) = 4 * c1 * (n / 4) = c1 * n
8*T(n/8) - 16 * T(n/16) = 8 * c1 * (n / 8) = c1 * n
...

He gets

T(n) - (2**(i+1))*T(1) = (i+1) * c1 * n
I don't see where he gets T(1) from. Maybe to get some insight into my
confusion, he has
T(n) = 2 * T(n/2) + c1 * n, T(1) = c2

I also don't see why he would introduce a second constant called c2
for T(1).
Jun 27 '08 #5
On Apr 25, 7:27 pm, Chad <cdal...@gmail.com> wrote:
On Apr 25, 7:16 pm, Richard Heathfield <r...@see.sig.invalid> wrote:


Chad said:
<snip>
>I don't see how O(n*n) can be derived from 1 + 2 + ... + n = n(n+1)/2
n(n+1)/2 is (n*n + n)/2. To get O(n*n) from this, bear in mind that n grows
insignificantly compared to n*n as n increases, and big-O is a sort of
big-picture measure that loses nitty-gritty detail such as n in the face
of something big and obvious like n*n. So we're down to (n*n)/2. To get
from there to n*n, simply buy a computer that runs at half the speed. (In
other words, constant factors aren't terribly interesting when compared to
the rate of growth of n - they don't change the /shape/ of the algorithmic
complexity.)
Actually, now that I think about it, I have no idea how one gets the
following summation
1 + 2 + 3 + ... + n
for bubble sort.
It's actually 1 + 2 + 3 + ... n-1
Consider { 6, 5, 4, 3, 2, 1 }
To sort this six-element array using bubble sort, you bubble the biggest
element to the right (five comparisons and swaps), and then you have an
unsorted five-element array and one sorted item. To sort the five-element
array using bubble sort, you bubble the biggest element to the right (four
comparisons and swaps), and then you have an unsorted four-element array
and two sorted items. To sort the four-element array using bubble sort,
you bubble the biggest element to the right (three comparisons and swaps),
and then you have an unsorted three-element array and three sorted items.
And so on. 5 + 4 + 3 + 2 + 1. It sums to n(n-1)/2, which is O(n*n).

Thank you for that clarification. Now, one last question. When Paul
sums up

T( n) - 2 * T(n/2 ) = c1 * (n ) = c1 * n
2*T(n/2) - 4 * T(n/4 ) = 2 * c1 * (n / 2) = c1 * n
4*T(n/4) - 8 * T(n/8 ) = 4 * c1 * (n / 4) = c1 * n
8*T(n/8) - 16 * T(n/16) = 8 * c1 * (n / 8) = c1 * n
...

He gets

T(n) - (2**(i+1))*T(1) = (i+1) * c1 * n

I don't see where he gets T(1) from. Maybe to get some insight into my
confusion, he has
T(n) = 2 * T(n/2) + c1 * n, T(1) = c2

I also don't see why he would introduce a second constant called c2
for T(1).
Okay, I just sat and thought about the merge sort derivation Paul
did, and now I see how his calculation works.
Jun 27 '08 #6

This thread has been closed and replies have been disabled. Please start a new discussion.
