
# A question about sort stable

A sort which is stable means it keeps the objects' original order. A sort
which is in place is stable. Can anyone explain to me what an in-place
sort is?
Thanks.
Mar 29 '07 #1
gaoxtwarrior wrote:
A sort which is stable means it keeps the objects' original order. A sort
which is in place is stable. Can anyone explain to me what an in-place
sort is?
It means no significant additional storage is used while sorting.

Mar 29 '07 #2

"gaoxtwarrior" <ga***@cssweb.com.cnwrote in message
>a sort which is stable means it keeps the object's original order. A sort
which is in place is stable. does anyone can explain me what is sort in
place?
Your premise is wrong. qsort() is an in-place sort - if you print out the
array as it is sorting you will see values moving up and down within it -
but it is not stable. If Fred, fred, freddie, and Freda all compare equal
because you are doing a case-insensitive comparison of the first four
letters, they will be contiguous but in a random order when the sort
finishes. A stable sort preserves the order of equal keys.
You can turn qsort into a stable sort by using the index as a tie breaker.
Mar 29 '07 #3
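The tie-breaker idea above can be sketched like this. The record type, helper name, and the "first four letters, case-insensitive" key all follow Malcolm's example and are illustrative, not from any real API: each element remembers its original slot, and the comparison falls back to that slot when the keys compare equal, which makes any correct qsort() behave stably.

```c
#include <stdlib.h>
#include <string.h>
#include <ctype.h>

/* Hypothetical record: a name plus the slot it originally occupied. */
struct rec {
    const char *name;
    size_t orig_index;   /* filled in before sorting; breaks ties */
};

/* Case-insensitive comparison of the first four letters, with the
   original index as a tie breaker so equal keys keep their order. */
static int cmp_stable(const void *a, const void *b)
{
    const struct rec *ra = a, *rb = b;
    for (int i = 0; i < 4; i++) {
        int ca = tolower((unsigned char)ra->name[i]);
        int cb = tolower((unsigned char)rb->name[i]);
        if (ca != cb)
            return ca - cb;
        if (ca == 0)
            break;               /* both strings ended early */
    }
    /* keys compare equal: fall back to the original position */
    return (ra->orig_index > rb->orig_index) -
           (ra->orig_index < rb->orig_index);
}

void stable_sort_names(struct rec *a, size_t n)
{
    for (size_t i = 0; i < n; i++)
        a[i].orig_index = i;     /* record the original order first */
    qsort(a, n, sizeof a[0], cmp_stable);
}
```

Because the tie-break makes the ordering total, the result is the same no matter which (unstable) algorithm the library's qsort() uses.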
Malcolm McLean wrote On 03/29/07 15:28,:
"gaoxtwarrior" <ga***@cssweb.com.cnwrote in message
>>a sort which is stable means it keeps the object's original order. A sort
which is in place is stable. does anyone can explain me what is sort in
place?

Your premise is wrong. qsort() is an in-place sort - if you print out the
array as it is sorting you will see values moving up and down within it -
but it is not stable. If Fred, fred, freddie, and Freda all compare equal
because you are doing a case-insensitive comparison of the first four
letters, they will be contiguous but in a random order when the sort
finishes. A stable sort preserves the order of equal keys.
You can turn qsort into a stable sort by using the index as a tie breaker.
Just to be clear: That's the "original" index, not the
index that the item happens to occupy at some random moment
during the sort.

--
Er*********@sun.com
Mar 29 '07 #4
On Thu, 29 Mar 2007 20:28:35 +0100, "Malcolm McLean"
<re*******@btinternet.com> wrote:
>
"gaoxtwarrior" <ga***@cssweb.com.cn> wrote in message
>>A sort which is stable means it keeps the objects' original order. A sort
which is in place is stable. Can anyone explain to me what an in-place
sort is?
Your premise is wrong. qsort() is an in-place sort - if you print out the
array as it is sorting you will see values moving up and down within it -
but it is not stable. If Fred, fred, freddie, and Freda all compare equal
because you are doing a case-insensitive comparison of the first four
letters, they will be contiguous but in a random order when the sort
finishes. A stable sort preserves the order of equal keys.
You can turn qsort into a stable sort by using the index as a tie breaker.
Your description applies to quicksort; IIANM qsort is a library routine
that can be implemented in any old way that the implementor feels like
using, including a stable sort.
>
Mar 29 '07 #5
"Malcolm McLean" <re*******@btinternet.comwrites:
"gaoxtwarrior" <ga***@cssweb.com.cnwrote in message
>a sort which is stable means it keeps the object's original order. A
sort which is in place is stable. does anyone can explain me what is
sort in place?
Your premise is wrong. qsort() is an sort in place - if you print out
the array as it is sorting you will see values moving up and down
within it -
but it is not stable.
[...]

I think you're talking about the Quicksort algorithm. The C qsort()
function, declared in <stdlib.h>, may or may not use Quicksort; it's
allowed to use either a stable or an unstable algorithm.

(I think there are also stable variants of Quicksort, but I don't know
the details.)

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <* <http://users.sdsc.edu/~kst>
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"
Mar 29 '07 #6
In article <ln************@nuthaus.mib.org>,
Keith Thompson <ks***@mib.org> wrote:
>>A sort which is stable means it keeps the objects' original order. A
sort which is in place is stable. Can anyone explain to me what an
in-place sort is?
>Your premise is wrong. qsort() is an in-place sort - if you print out
the array as it is sorting you will see values moving up and down
within it -
but it is not stable.
>I think you're talking about the Quicksort algorithm. The C qsort()
function, declared in <stdlib.h>, may or may not use Quicksort; it's
allowed to use either a stable or an unstable algorithm.
The other part is true though: qsort() does sort in place, as far as
the input and output are concerned. And qsort() is not guaranteed to
be stable, and isn't in many implementations, so it serves as an
example of an in-place sort which is not necessarily stable.

-- Richard

--
"Consideration shall be given to the need for as many as 32 characters
in some alphabets" - X3.4, 1963.
Mar 29 '07 #7
santosh <sa*********@gmail.com> wrote:
gaoxtwarrior wrote:
>A sort which is stable means it keeps the objects' original order. A sort
which is in place is stable. Can anyone explain to me what an in-place
sort is?

It means no significant additional storage is used while sorting.
Which means it has nothing to do with stability. An in-place sort need
not be stable. In fact, the standard library function qsort() is an
in-place sort, but it's not guaranteed to be stable and most
implementations are not.

-Larry Jones

It's not denial. I'm just very selective about the reality I accept.
-- Calvin
Mar 29 '07 #8
santosh wrote:
gaoxtwarrior wrote:
>a sort which is stable means it keeps the object's original order.
A sort which is in place is stable. does anyone can explain me
what is sort in place?

It means no significant additional storage is used while sorting.
Nonsense on the stable part. Stable means items which compare equal
appear in the original order.

--
Chuck F (cbfalconer at maineline dot net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net>

--
Posted via a free Usenet account from http://www.teranews.com

Mar 30 '07 #9
Keith Thompson wrote:
>
.... snip ...
>
(I think there are also stable variants of Quicksort, but I don't know
the details.)
I don't think so, without using auxiliary data (such as indices).
I am prepared to be proven wrong on this.

--
Chuck F (cbfalconer at maineline dot net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net>

--
Posted via a free Usenet account from http://www.teranews.com

Mar 30 '07 #10
On Mar 29, 9:32 pm, CBFalconer <cbfalco...@yahoo.com> wrote:
Keith Thompson wrote:

... snip ...
(I think there are also stable variants of Quicksort, but I don't know
the details.)

I don't think so, without using auxiliary data (such as indices).
I am prepared to be proven wrong on this.
I was reading through some APIs and came across a description of a
function that was implementing a "non-recursive" quicksort. Does this
mean that it doesn't use recursive calls or does it mean that it
doesn't use a stack at all?

--
Ioan - Ciprian Tandau
tandau _at_ freeshell _dot_ org (hope it's not too late)
(... and that it still works...)

Mar 30 '07 #11
Nelu wrote:
CBFalconer <cbfalco...@yahoo.com> wrote:
>Keith Thompson wrote:

... snip ...
>>(I think there are also stable variants of Quicksort, but I don't know
the details.)

I don't think so, without using auxiliary data (such as indices).
I am prepared to be proven wrong on this.

I was reading through some APIs and came across a description of a
function that was implementing a "non-recursive" quicksort. Does this
mean that it doesn't use recursive calls or does it mean that it
doesn't use a stack at all?
No recursion. The stack, or a reasonable facsimile, is needed.

--
Chuck F (cbfalconer at maineline dot net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net>

--
Posted via a free Usenet account from http://www.teranews.com

Mar 30 '07 #12
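The "stack, or a reasonable facsimile" can be made concrete. Below is an illustrative int-only quicksort that pushes the two sub-ranges from each partition onto an explicit array of ranges instead of making recursive calls; it is a sketch, not any particular library's qsort(). Pushing the larger half and continuing on the smaller one bounds the stack depth at roughly log2(n):

```c
#include <stddef.h>

#define MAXDEPTH 64   /* ample: depth stays ~log2(n) because the
                         smaller half is always processed first */

void quicksort_iter(int *a, size_t n)
{
    struct range { size_t lo, hi; } stack[MAXDEPTH];
    int top = 0;

    if (n < 2)
        return;
    stack[top].lo = 0;
    stack[top].hi = n;
    top++;

    while (top > 0) {
        struct range r = stack[--top];
        while (r.hi - r.lo > 1) {
            size_t lo = r.lo, hi = r.hi;
            size_t mid = lo + (hi - lo) / 2;
            int t, pivot;
            size_t store, i;

            /* move the middle element to the front as pivot (Lomuto) */
            t = a[lo]; a[lo] = a[mid]; a[mid] = t;
            pivot = a[lo];
            store = lo;
            for (i = lo + 1; i < hi; i++) {
                if (a[i] < pivot) {
                    store++;
                    t = a[store]; a[store] = a[i]; a[i] = t;
                }
            }
            t = a[lo]; a[lo] = a[store]; a[store] = t;

            /* push the larger side for later, keep looping on the smaller */
            if (store - lo < hi - store - 1) {
                stack[top].lo = store + 1; stack[top].hi = hi; top++;
                r.hi = store;            /* continue with the left side  */
            } else {
                stack[top].lo = lo; stack[top].hi = store; top++;
                r.lo = store + 1;        /* continue with the right side */
            }
        }
    }
}
```

Whether one calls this "recursive" is exactly the question debated below; mechanically, the saved ranges play the role the call stack plays in the recursive version.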
CBFalconer <cb********@yahoo.com> writes:
Keith Thompson wrote:
... snip ...
>>
(I think there are also stable variants of Quicksort, but I don't know
the details.)

I don't think so, without using auxiliary data (such as indices).
I am prepared to be proven wrong on this.
A Google search on "stable quicksort" gets a lot of hits. That's
about all I know on the subject.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <* <http://users.sdsc.edu/~kst>
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"
Mar 30 '07 #13
On Mar 30, 12:44 am, CBFalconer <cbfalco...@yahoo.com> wrote:
Nelu wrote:
CBFalconer <cbfalco...@yahoo.com> wrote:
Keith Thompson wrote:
... snip ...
>(I think there are also stable variants of Quicksort, but I don't know
the details.)
I don't think so, without using auxiliary data (such as indices).
I am prepared to be proven wrong on this.
I was reading through some APIs and came across a description of a
function that was implementing a "non-recursive" quicksort. Does this
mean that it doesn't use recursive calls or does it mean that it
doesn't use a stack at all?

No recursion. The stack, or a reasonable facsimile, is needed.
I thought that maintaining your own stack in an iterative algorithm to
solve a recursively defined problem was also called recursion because
the objects at every step are defined based on the previously saved
objects/states and a recursive call was just a more straightforward
way to define recursion. Is that wrong?

--
Ioan - Ciprian Tandau
tandau _at_ freeshell _dot_ org (hope it's not too late)
(... and that it still works...)

Mar 30 '07 #14
Nelu wrote On 03/30/07 11:08,:
On Mar 30, 12:44 am, CBFalconer <cbfalco...@yahoo.com> wrote:
>>Nelu wrote:
>>>CBFalconer <cbfalco...@yahoo.com> wrote:

Keith Thompson wrote:
>>>>... snip ...
>>>>>(I think there are also stable variants of Quicksort, but I don't know
>the details.)
>>>>I don't think so, without using auxiliary data (such as indices).
I am prepared to be proven wrong on this.
>>>I was reading through some APIs and came across a description of a
function that was implementing a "non-recursive" quicksort. Does this
mean that it doesn't use recursive calls or does it mean that it
doesn't use a stack at all?

No recursion. The stack, or a reasonable facsimile, is needed.

I thought that maintaining your own stack in an iterative algorithm to
solve a recursively defined problem was also called recursion because
the objects at every step are defined based on the previously saved
objects/states and a recursive call was just a more straightforward
way to define recursion. Is that wrong?
The distinction seems less sharp. It's known, after all,
that recursions can be transformed into iterations and vice
versa, so in a sense they can be seen as equivalent. With
that in mind, trying to label a particular algorithm as one
or the other is something of a deliberate inaccuracy.

Here, for example, is an implementation of Euclid's GCD
algorithm:

unsigned int gcd(unsigned int u, unsigned int v) {
    return (v == 0) ? u : gcd(v, u % v);
}

I think pretty much anyone would characterize this implementation
as "recursive." But here's another implementation of the very
same algorithm:

unsigned int gcd(unsigned int u, unsigned int v) {
    if (v > 0) {
        unsigned int r;
        while ((r = u % v) != 0) {
            u = v;
            v = r;
        }
    }
    return u;
}

... which I think you will agree is "iterative." And yet,
it performs the same calculations and the same tests as the
first! Only the expression has changed, not the algorithm
itself, so I don't think it makes sense to be dogmatic about
saying "This algorithm is recursive!" or "This algorithm is
iterative!"

--
Er*********@sun.com
Mar 30 '07 #15
On Mar 30, 11:22 am, Eric Sosman <Eric.Sos...@sun.com> wrote:
Nelu wrote On 03/30/07 11:08,:
On Mar 30, 12:44 am, CBFalconer <cbfalco...@yahoo.com> wrote:
>Nelu wrote:
>>CBFalconer <cbfalco...@yahoo.com> wrote:
>>>Keith Thompson wrote:
>>>... snip ...
>>>>(I think there are also stable variants of Quicksort, but I don't know
the details.)
>>>I don't think so, without using auxiliary data (such as indices).
I am prepared to be proven wrong on this.
>>I was reading through some APIs and came across a description of a
function that was implementing a "non-recursive" quicksort. Does this
mean that it doesn't use recursive calls or does it mean that it
doesn't use a stack at all?
>No recursion. The stack, or a reasonable facsimile, is needed.
I thought that maintaining your own stack in an iterative algorithm to
solve a recursively defined problem was also called recursion because
the objects at every step are defined based on the previously saved
objects/states and a recursive call was just a more straightforward
way to define recursion. Is that wrong?

The distinction seems less sharp. It's known, after all,
that recursions can be transformed into iterations and vice
versa, so in a sense they can be seen as equivalent. With
that in mind, trying to label a particular algorithm as one
or the other is something of a deliberate inaccuracy.

Here, for example, is an implementation of Euclid's GCD
algorithm:

unsigned int gcd(unsigned int u, unsigned int v) {
    return (v == 0) ? u : gcd(v, u % v);
}

I think pretty much anyone would characterize this implementation
as "recursive." But here's another implementation of the very
same algorithm:

unsigned int gcd(unsigned int u, unsigned int v) {
    if (v > 0) {
        unsigned int r;
        while ((r = u % v) != 0) {
            u = v;
            v = r;
        }
    }
    return u;
}

... which I think you will agree is "iterative." And yet,
it performs the same calculations and the same tests as the
first! Only the expression has changed, not the algorithm
itself, so I don't think it makes sense to be dogmatic about
saying "This algorithm is recursive!" or "This algorithm is
iterative!"
Well, yes and no. In your example (iteration vs. tail recursion) there
is a recursive call but the algorithm does not need to save the state
of its objects. I mean you can make an iteration into a recursive call
but a recursive call may need something extra to be transformed into
an iteration, i.e. tail-recursion solution <-> iterative solution;
recursion -> iterative + state stack solution.
I guess my question was whether an iteration that needs to save the
state of its objects at every step (or in a number of steps
proportional to the number of iteration steps) is a recursion or not.

--
Ioan - Ciprian Tandau
tandau _at_ freeshell _dot_ org (hope it's not too late)
(... and that it still works...)

Mar 30 '07 #16
Nelu wrote On 03/30/07 11:43,:
>
Well, yes and no. In your example (iteration vs. tail recursion) there
is a recursive call but the algorithm does not need to save the state
of its objects. I mean you can make an iteration into a recursive call
but a recursive call may need something extra to be transformed into
an iteration, i.e. tail-recursion solution <-> iterative solution;
recursion -> iterative + state stack solution.
I guess my question was whether an iteration that needs to save the
state of its objects at every step (or in a number of steps
proportional to the number of iteration steps) is a recursion or not.
Somehow I *knew* somebody would mention the magic mantra
of "tail recursion." As if it made a difference ...

All right, if tail recursion bothers you as being somehow
non-recursive, here's the recursion example I hate the most
(because nobody in his right mind would write this code):

long double fib(unsigned int n) {
    return n > 1 ? fib(n-1) + fib(n-2) : n > 0;
}

... which I think you'll agree is recursive, and not tail-
recursive. And here's an iterative expression of the exact
same idea, just done differently (and far more efficiently):

long double fib(unsigned int n) {
    long double g = 0, h = 1;
    while (n-- > 0) {
        long double s = g + h;
        g = h;
        h = s;
    }
    return g;
}

Both implementations rely on forming a Fibonacci number as
the sum of the two that precede it; the crux of the method
is the same in both cases even though the implementations
are vastly different. So: Is the computation of Fibonacci
numbers recursive or iterative? My claim is that you can't
assign either label to the algorithm and have the claim
stand beyond debate. It makes sense to label implementations
as recursive or iterative, but one is on shaky ground when
saying that a non-recursive implementation is "recursion
in disguise." It can be a false dichotomy.

(There is, of course, a way to compute Fibonacci numbers
without recursion and without iteration, if you grant yourself
the ability to compute powers. Difficult to do accurately
for large n, though.)

(The theory of automata uses "recursively computable"
to cover some of the things we're discussing here, but it's
really not the same notion. It covers a lot of things that
programmers wouldn't think of as recursive: `++i', for
example, is a recursively computable function.)

--
Er*********@sun.com
Mar 30 '07 #17
On Mar 30, 12:23 pm, Eric Sosman <Eric.Sos...@sun.com> wrote:
Nelu wrote On 03/30/07 11:43,:
Well, yes and no. In your example (iteration vs. tail recursion) there
is a recursive call but the algorithm does not need to save the state
of its objects. I mean you can make an iteration into a recursive call
but a recursive call may need something extra to be transformed into
an iteration, i.e. tail-recursion solution <-> iterative solution;
recursion -> iterative + state stack solution.
I guess my question was whether an iteration that needs to save the
state of its objects at every step (or in a number of steps
proportional to the number of iteration steps) is a recursion or not.

Somehow I *knew* somebody would mention the magic mantra
of "tail recursion." As if it made a difference ...

All right, if tail recursion bothers you as being somehow
non-recursive, here's the recursion example I hate the most
(because nobody in his right mind would write this code):

long double fib(unsigned int n) {
    return n > 1 ? fib(n-1) + fib(n-2) : n > 0;
}
Ok, I'll pretend I am not bothered by the double calculations. :-)
... which I think you'll agree is recursive, and not tail-
recursive.
I do.
And here's an iterative expression of the exact
same idea, just done differently (and far more efficiently):

long double fib(unsigned int n) {
    long double g = 0, h = 1;
    while (n-- > 0) {
        long double s = g + h;
        g = h;
        h = s;
    }
    return g;
}

Both implementations rely on forming a Fibonacci number as
the sum of the two that precede it; the crux of the method
is the same in both cases even though the implementations
are vastly different. So: Is the computation of Fibonacci
numbers recursive or iterative?
It is recursive in the first case and iterative in the second. The
first case will use a stack to track the status of each call and
you'll have different sizes for the stack at each call. In the second
case, at each iteration step you only track one state throughout the
entire process.
My claim is that you can't
assign either label to the algorithm and have the claim
stand beyond debate. It make sense to label implementations
as recursive or iterative, but one is on shaky ground when
saying that a non-recursive implementation is "recursion
in disguise." It can be a false dichotomy.
What I'm claiming is that if you have an iterative process that uses a
structure (stack or similar) to keep track of the previous states of
the iteration, and the size of the structure depends on the number of
iteration steps, then that is a recursion.
Your second example doesn't meet this criterion. You have a recursive
implementation and an iterative implementation. However, leaving aside
the double computation in the first example so I don't have to use two
stacks and think too much :-), an equivalent non-recursive-call
implementation could be this:
long fib(unsigned int n) {
    struct vector s;   /* growable array used as an explicit state stack */
    int i;
    long res;
    for (i = 1; i < n; i++) {
        /* push the next value, computed from the states saved before it */
    }
    res = n == 0 ? 1 : vector_get(s, n - 1);
    vector_clear(s);
    return res;
}
which, I think, is a recursive implementation that doesn't involve a
recursive call because it replaces the call by maintaining the stack/
vector itself.

Your examples show that starting from a recursive definition you may
be able to find an algorithm that can take shortcuts to avoid
recursion but the algorithm is not equivalent, in resource
consumption, to the recursive one, which does extra work (even if
behind the scenes) and requires extra memory.
In the case of quicksort there is no way (at least I don't know of any
way) of implementing an algorithm that uses a fixed number of states
that do not depend on the length of the array that needs to be sorted.

--
Ioan - Ciprian Tandau
tandau _at_ freeshell _dot_ org (hope it's not too late)
(... and that it still works...)

Mar 30 '07 #18
"Richard Harter" <cr*@tiac.netwrote in message
On Thu, 29 Mar 2007 20:28:35 +0100, "Malcolm McLean"
<re*******@btinternet.comwrote:
>>
"gaoxtwarrior" <ga***@cssweb.com.cnwrote in message
>>>a sort which is stable means it keeps the object's original order. A sort
which is in place is stable. does anyone can explain me what is sort in
place?
Your premise is wrong. qsort() is an sort in place - if you print out the
array as it is sorting you will see values moving up and down within it -
but it is not stable. If Fred fred freddie and Freda all compare equal
because you are doing a case-insensitive comparision of the first four
letters, they will be contigous but in a random order when the sort
finished. A stable sort preserve the order of equal keys.
You can turn qsort into a stable sort by using the index as a tie breaker.

Your description applies to quicksort; IIANM qsort is a library routine
that can be implemented in any old way that the implementor feels like
using, including a stable sort.
Strictly you are right. However if a function called qsort() doesn't
implement quicksort then you ought to send it back.

Mar 30 '07 #19
CBFalconer wrote:
Keith Thompson wrote:
.... snip ...
>(I think there are also stable variants of Quicksort, but I don't know
the details.)

I don't think so, without using auxiliary data (such as indices).
I am prepared to be proven wrong on this.
If, in the comparison routine, one were to compare the pointers when the
other data compare equal, that would make it stable, yes?

At least that's my reading of 6.5.8.5
--
Kyle A. York
Sr. Subordinate Grunt
Mar 30 '07 #20
Malcolm McLean wrote, On 30/03/07 22:05:
"Richard Harter" <cr*@tiac.netwrote in message
<snip>
>Your description applies to quicksort; IIANM qsort is a library routine
that can be implemented in any old way that the implementor feels like
using, including a stable sort.
Strictly you are right. However if a function called qsort() doesn't
implement quicksort then you ought to send it back.
Why if it meets the requirements? Especially as in any decent library it
will be at least as good as a "standard" quicksort implementation for
most uses. If you want a specific algorithm, you can always obtain or
write an implementation.
--
Flash Gordon
Mar 30 '07 #21
"Malcolm McLean" <re*******@btinternet.comwrites:
"Richard Harter" <cr*@tiac.netwrote in message
[...]
>Your description applies to quicksort; IIANM qsort is a library routine
that can be implemented in any old way that the implementor feels like
using, including a stable sort.
Strictly you are right. However if a function called qsort() doesn't
implement quicksort then you ought to send it back.
Why? There are plenty of good sorting algorithms other than
Quicksort. Why shouldn't qsort() use one of them?

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <* <http://users.sdsc.edu/~kst>
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"
Mar 30 '07 #22
In article <Vt******************************@bt.com>,
Malcolm McLean <re*******@btinternet.com> wrote:
>Strictly you are right. However if a function called qsort() doesn't
implement quicksort then you ought to send it back.
It's unfortunate that C's sort function is called qsort(). But most
implementations use something better than vanilla quicksort, and I'd
be more inclined to send it back if it didn't.

-- Richard

--
"Consideration shall be given to the need for as many as 32 characters
in some alphabets" - X3.4, 1963.
Mar 30 '07 #23
Malcolm McLean wrote:
"Richard Harter" <cr*@tiac.netwrote in message
.... snip ...
>>
Your description applies to quicksort; IIANM qsort is a library
routine that can be implemented in any old way that the
implementor feels like using, including a stable sort.

Strictly you are right. However if a function called qsort()
doesn't implement quicksort then you ought to send it back.
Why? Maybe the implementor wants to guarantee the worst case
execution time? Bye bye quicksort.

--
Chuck F (cbfalconer at maineline dot net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net>

--
Posted via a free Usenet account from http://www.teranews.com

Mar 30 '07 #24
On Mar 30, 2:05 pm, "Malcolm McLean" <regniz...@btinternet.com> wrote:
[snip]
Strictly you are right. However if a function called qsort() doesn't
implement quicksort then you ought to send it back.
Most modern qsort() functions do not use the quicksort algorithm
(though quicksort is in the bag of tricks).
For most implementations that I have seen, it is a hybrid algorithm
that works (roughly speaking) like this:
If the partition is small, then use insertion sort.
If the partition is large, then subdivide the partition (typical
quicksort method)...
However -- if the number of subdivisions is already at log(n) then we
have a perverse data set, so switch to heapsort.

The idea of using insertion sort when the set is not big is due to
Richard Singleton (ACM algorithm 347).
The idea of using heapsort when the sort goes perverse is called
Introspective sort. Introspective sort is due to David Musser:
http://www.cs.rpi.edu/~musser/gp/introsort.ps

Mar 30 '07 #25
Keith Thompson wrote:
"Malcolm McLean" <re*******@btinternet.comwrites:
>"Richard Harter" <cr*@tiac.netwrote in message
[...]
>>Your description applies to quicksort; IIANM qsort is a library routine
that can be implemented in any old way that the implementor feels like
using, including a stable sort.
Strictly you are right. However if a function called qsort() doesn't
implement quicksort then you ought to send it back.

Why? There are plenty of good sorting algorithms other than
Quicksort. Why shouldn't qsort() use one of them?
Please nominate an in-place array sort for qsort() quicker than
quicksort. I like insertion sorts and especially Shell sorts because
they are simpler. But they are not quicker.

--
Joe Wright
"Everything should be made as simple as possible, but not simpler."
--- Albert Einstein ---
Mar 31 '07 #26
Joe Wright <jo********@comcast.net> writes:
Keith Thompson wrote:
>"Malcolm McLean" <re*******@btinternet.com> writes:
>>"Richard Harter" <cr*@tiac.net> wrote in message
[...]
>>>Your description applies to quicksort; IIANM qsort is a library routine
that can be implemented in any old way that the implementor feels like
using, including a stable sort.

Strictly you are right. However if a function called qsort()
doesn't implement quicksort then you ought to send it back.
Why? There are plenty of good sorting algorithms other than
Quicksort. Why shouldn't qsort() use one of them?
Please nominate an in-place array sort for qsort() quicker than
quicksort. I like insertion sorts and especially Shell sorts because
they are simpler. But they are not quicker.
There are a number of O(n log n) sorting algorithms. I haven't
studied them recently enough to make any definitive statements, but
Mergesort and Heapsort are two examples.

Naive Quicksort is O(n**2), but that's not hard to correct.

--
Keith Thompson (The_Other_Keith) ks***@mib.org <http://www.ghoti.net/~kst>
San Diego Supercomputer Center <* <http://users.sdsc.edu/~kst>
"We must do something. This is something. Therefore, we must do this."
-- Antony Jay and Jonathan Lynn, "Yes Minister"
Mar 31 '07 #27
Keith Thompson wrote:
Joe Wright <jo********@comcast.net> writes:
.... snip ...
>
>Please nominate an in-place array sort for qsort() quicker than
quicksort. I like insertion sorts and especially Shell sorts
because they are simpler. But they are not quicker.

There are a number of O(n log n) sorting algorithms. I haven't
studied them recently enough to make any definitive statements,
but Mergesort and Heapsort are two examples.

Naive Quicksort is O(n**2), but that's not hard to correct.
It is impossible for all possible inputs. There is always a worst
case.

And the optimum depends highly on the size of the sortee. For
example, sorting the array a = [2, 1] involves only one comparison
and one-half (avg) exchange. Setting up and tearing down a more
sophisticated sort takes time and effort.

--
Chuck F (cbfalconer at maineline dot net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net>
--
Posted via a free Usenet account from http://www.teranews.com

Mar 31 '07 #28

"kyle york" <ky***@cisco.comwrote in message
CBFalconer wrote:
>Keith Thompson wrote:
.... snip ...
>>(I think there are also stable variants of Quicksort, but I don't know
the details.)

I don't think so, without using auxiliary data (such as indices). I am
prepared to be proven wrong on this.

If, in the comparison routine, one were to compare the pointers when the
other data compare equal, that would make it stable, yes?

At least that's my reading of 6.5.8.5

You can't compare the pointers passed to you by qsort() because the way it
works is to randomly select one item as a "pivot", move it to the lowest
place, and then run through the array putting all the items above the pivot
to the top and below the pivot to the bottom. The pivot is always the top of
the "bottom" heap. The pivot is now in the right place and you have two
unsorted arrays above and below it. So it then calls itself recursively on
those two arrays until the array length goes to one.

Here's a version I wrote. bubblesort() and swap() do what you would expect.
I can't see an easy way to undo the swapping of data items to make the sort
stable, without gobbling lots of extra memory.

/*
  quicksort - quicksort algorithm
  Params: ptr - base of array
          N - number of items
          sz - width of each item
          comp - comparison function
*/
void quicksort(void *ptr, int N, int sz,
               int (*comp)(const void *e1, const void *e2))
{
    int pivot = 0;
    int left = 1;
    int right = N;
    unsigned char *xptr = ptr;

    if (N < 8)
    {
        bubblesort(ptr, N, sz, comp);
        return;
    }
    else
    {
        swap(xptr, xptr + (N/2) * sz, sz);
        while (left < right)
        {
            if ((*comp)(xptr + left * sz, xptr + pivot * sz) <= 0)
            {
                left++;
            }
            else
            {
                right--;
                swap(xptr + left * sz, xptr + right * sz, sz);
            }
        }
        left--;
        swap(xptr, xptr + left * sz, sz);
        quicksort(xptr, left, sz, comp);
        quicksort(xptr + (left + 1) * sz, N - left - 1, sz, comp);
    }
}
--
Free games and programming goodies.
http://www.personal.leeds.ac.uk/~bgy1mm

Mar 31 '07 #29
On Sat, 31 Mar 2007 00:25:52 -0500, CBFalconer <cb********@yahoo.com>
wrote:
>Keith Thompson wrote:
>Joe Wright <jo********@comcast.net> writes:
... snip ...
>>
>>Please nominate an in-place array sort for qsort() quicker than
quicksort. I like insertion sorts and especially Shell sorts
because they are simpler. But they are not quicker.

There are a number of O(n log n) sorting algorithms. I haven't
studied them recently enough to make any definitive statements,
but Mergesort and Heapsort are two examples.

Naive Quicksort is O(n**2), but that's not hard to correct.

It is impossible for all possible inputs. There is always a worst
case.
This isn't quite true. The thing is, quicksort is actually a family of
algorithms. In pseudo code the algorithm is:

function quicksort(data)
    choose a pivot
    partition data into left and right
    sort left
    sort right

There are various details and fiddles, e.g., fat pivot, sort the smaller
partition first, etc., but the thing that determines whether a
particular implementation is worst case O(n**2) is how you choose the
pivot. Rather generally, the cost of choosing a "good" pivot is O(n),
where a "good" pivot choice algorithm is one that guarantees that the
quicksort implementation's worst case performance is O(n log n).

The difficulty is that the cure is usually worse than the disease.
>
And the optimum depends highly on the size of the sortee. For
example, sorting the array a = [2, 1] involves only one comparison
and one-half (avg) exchange. Setting up and tearing down a more
sophisticated sort takes time and effort.

--
Chuck F (cbfalconer at maineline dot net)
Available for consulting/temporary embedded and systems.
<http://cbfalconer.home.att.net>
--
Posted via a free Usenet account from http://www.teranews.com
Mar 31 '07 #30
On Mar 30, 7:07 pm, Joe Wright <joewwri...@comcast.net> wrote:
Keith Thompson wrote:
"Malcolm McLean" <regniz...@btinternet.com> writes:
"Richard Harter" <c...@tiac.net> wrote in message
[...]
>Your description applies to quicksort; IIANM qsort is a library routine
that can be implemented in any old way that the implementor feels like
using, including a stable sort.
Strictly you are right. However if a function called qsort() doesn't
implement quicksort then you ought to send it back.
Why? There are plenty of good sorting algorithms other than
Quicksort. Why shouldn't qsort() use one of them?

Please nominate an in-place array sort for qsort() quicker than
quicksort. I like insertion sorts and especially Shell sorts because
they are simpler. But they are not quicker.
Weak heapsort is faster (but requires N additional bits of storage,
and therefore is not totally 'in place')

Related, very interesting paper:
"Implementing HEAPSORT with n log n -0.9n and QUICKSORT with n log n +
0.2n Comparisons"
Stefan Edelkamp, Patrick Stiegeler
You can find the source code for this project here:
http://www.jea.acm.org/CODE/Vol7Nbr5.tar

For integers, you can sort in O(n log(log(n))) time.

These look interesting:
"A Heap-Based Optimal Inversions-Sensitive Sorting Algorithm"
Srinath Sridhar

The papers on generalized quicksort and on presorting also look very
interesting.

Of course, for multiple CPU/multiple disk systems, everything changes
and some other algorithms dominate.

Apr 4 '07 #31
