I was looking at some old posts in comp.lang.c and found the following http://groups.google.com/group/comp....b5046326994e18
I have some questions regarding the post.
First, and I quote
"Given an algorithm with loops or recursive calls, the way you find
a big-O equivalence class for that algorithm is to write down a
"recurrence relation" for the time taken to execute the code. For
instance, for merge sort, one gets (fixed width font required here;
$x$ denotes a variable x, etc.):

        T(n) = { $a$,              $n = 1$, $a$ a constant
               { $2T(n/2) + cn$,   $n > 1$, $c$ a constant

You then apply induction-type reasoning to show that, e.g., when
$n$ is a power of two, $T(n) = an + cn \log_2 n$. In big-O notation,
all the constant factors vanish, so this shows that merge sort is
O(n log n)."
First, how do you get this recurrence relation for merge sort? Second,
how did he get O(n log n)?
And how, later on
"(Personally, I always found the worst part of dealing with recurrence
relations to be going from an open form to a closed one -- you just
have to have memorized all those formulae like "1 + 2 + ... + n =
n(n+1)/2", etc., and recognize them in the recurrences. Once you
can see this summation and recognize the closed form, it instantly
becomes obvious that, e.g., bubble sort is O(n*n).)"
I don't see how O(n*n) can be derived from 1 + 2 + ... + n = n(n+1)/2
On Apr 25, 8:27 am, Chad <cdal...@gmail.com> wrote:
<snip>
First, how do you get this recurrence relation for mergesort?
A merge sort of two or more elements is performed by doing two merge
sorts of half the list (top half, then bottom half), then merging the
two sorted halves. The merge can be done by examining the leading
element of each list and doing a single comparison and move. There are
at most (n-1) compares, and n moves. So if we are counting comparisons
or moves, we get something like:

X(n) <= 2 * X(n/2) + c1 * n

(2 merge sorts of half the list, then a merge of the two half-sized
lists to an output of the whole starting list.) For one element,
X(1) = c2. So if we take a minimal upper bound T(n) >= X(n) on the
number of operations, we get:

T(n) = 2 * T(n/2) + c1 * n, T(1) = c2.
[...] Second, how did he get O(n log n)? [...]
Let us suppose that n = 2**(i+1), i.e., i = lg_2(n) - 1 (= lg_2(n/2)).
We can form the following telescoping sum:
T(n)        -  2 * T(n/2 )    =          c1 * n      = c1 * n
2*T(n/2)    -  4 * T(n/4 )    =      2 * c1 * (n / 2) = c1 * n
4*T(n/4)    -  8 * T(n/8 )    =      4 * c1 * (n / 4) = c1 * n
8*T(n/8)    - 16 * T(n/16)    =      8 * c1 * (n / 8) = c1 * n
...
(2**i)*T(2) - (2**(i+1))*T(1) = (2**i) * c1 * 2      = c1 * n
Then summing vertically we get:
T(n) - (2**(i+1))*T(1) = (i+1) * c1 * n
or:
T(n) = n * c2 + (lg_2(n)) * c1 * n
(using T(1) = c2, 2**(i+1) = n, and i + 1 = lg_2(n))
We have only derived this for n as a power of two, but we can back
substitute to verify that it is correct in general:
(T(n) - c1 * n) / 2
= (n * c2 + (lg_2(n)) * c1 * n - c1 * n) / 2
= (n * c2 + (lg_2(n) - 1) * c1 * n) / 2
= (n/2) * c2 + (lg_2(n) - 1) * c1 * (n / 2)
= (n/2) * c2 + (lg_2(n/2)) * c1 * (n / 2)
= T(n/2)
So we are set. In any event, the dominant term with respect to n is
(lg_2(n)) * c1 * n, so we can conclude that T(n) is O(n*log(n)); the
base of the logarithm only changes the constant factor.
And how, later on
"(Personally, I always found the worst part of dealing with
recurrence relations to be going from an open form to a closed one
-- you just have to have memorized all those formulae like "1 + 2
+ ... + n = n(n+1)/2", etc., and recognize them in the recurrences.
Once you can see this summation and recognize the closed form, it
instantly becomes obvious that, e.g., bubble sort is O(n*n).)"
Whatever; this is just basic mathematical skill.
I don't see how O(n*n) can be derived from 1 + 2 + ... + n = n(n+1)/2
n(n+1)/2 = n**2/2 + n/2, which is dominated by n**2/2. More rigorously:

lim_{n->inf} ( (n*(n+1)/2) / (n**2/2) ) = 1

which just brings us back to the basic definition of O(f(n)).

Paul Hsieh http://www.pobox.com/~qed/ http://bstring.sf.net/
On Apr 25, 8:27 am, Chad <cdal...@gmail.com> wrote:
<snip>
Actually, now that I think about it, I have no idea how one gets the
following summation

1 + 2 + 3 + ... + n

for bubble sort.
Chad said:
<snip>
>> I don't see how O(n*n) can be derived from 1 + 2 + ... + n = n(n+1)/2
n(n+1)/2 is (n*n + n)/2. To get O(n*n) from this, bear in mind that n grows
insignificantly compared to n*n as n increases, and big-O is a sort of
big-picture measure that loses nitty-gritty detail such as n in the face
of something big and obvious like n*n. So we're down to (n*n)/2. To get
from there to n*n, simply buy a computer that runs at half the speed. (In
other words, constant factors aren't terribly interesting when compared to
the rate of growth of n -- they don't change the /shape/ of the algorithmic
complexity.)
Actually, now that I think about it, I have no idea how one gets the
following summation
1 + 2 + 3 + ... + n
for bubble sort.
It's actually 1 + 2 + 3 + ... + (n-1).
Consider { 6, 5, 4, 3, 2, 1 }.

To sort this six-element array using bubble sort, you bubble the biggest
element to the right (five comparisons and swaps), and then you have an
unsorted five-element array and one sorted item. To sort the five-element
array using bubble sort, you bubble the biggest element to the right (four
comparisons and swaps), and then you have an unsorted four-element array
and two sorted items. To sort the four-element array using bubble sort,
you bubble the biggest element to the right (three comparisons and swaps),
and then you have an unsorted three-element array and three sorted items.
And so on: 5 + 4 + 3 + 2 + 1. It sums to n(n-1)/2, which is O(n*n).

Richard Heathfield <http://www.cpax.org.uk >
Email: http://www. +rjh@
Google users: <http://www.cpax.org.uk/prg/writings/googly.php>
"Usenet is a strange place"  dmr 29 July 1999
On Apr 25, 7:16 pm, Richard Heathfield <r...@see.sig.invalid> wrote:
<snip>
Thank you for that clarification. Now, one last question. When Paul
sums up
T(n)     -  2 * T(n/2 ) =     c1 * n       = c1 * n
2*T(n/2) -  4 * T(n/4 ) = 2 * c1 * (n / 2) = c1 * n
4*T(n/4) -  8 * T(n/8 ) = 4 * c1 * (n / 4) = c1 * n
8*T(n/8) - 16 * T(n/16) = 8 * c1 * (n / 8) = c1 * n
...
He gets
T(n) - (2**(i+1))*T(1) = (i+1) * c1 * n
I don't see where he gets T(1) from. Maybe this gives some insight into
my confusion: he has
T(n) = 2 * T(n/2) + c1 * n, T(1) = c2
I also don't see why he would introduce a second constant called c2
for T(1).
On Apr 25, 7:27 pm, Chad <cdal...@gmail.com> wrote:
<snip>
Okay, I just sat and thought about the merge sort derivation Paul did,
and now I see how his calculation works.