While it was 2/23/04 9:01 PM throughout the UK, Moth sprinkled little
black dots on a white screen, and they fell thus:

> Big O is pretty simple. The O stands for "the Order of" and it shows the
> general number of comparisons needed in a sort algorithm. For example, the
> average number of comparisons for a selection sort is =N^2/2-N/2. In this
> case the N^2 dominates and so the big O notation is O(N^2). For a quicksort
> I believe it is =N(log2N)

That looks like it means either:
- the logarithm, to some unspecified base, of 2N
- the logarithm to base 2N

I think you mean N log2 N, or if you want to be a little clearer,
N * log_2 N.

> which is defined as O(log2 N) because the log2N,
> against a large number of items, will dictate the amount of time taken.

Actually, O(N*log N) is an order in its own right. N*log_2 N is far
from asymptotic to log_2 N - try plotting it and you'll see.
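
If you'd rather compute than plot, here's a quick Python sketch of my own
(not from any library) that shows the gap:

```python
import math

# Compare N*log2(N) against log2(N) at a few sizes. Their ratio is
# exactly N, so the two functions are not remotely the same order.
for n in (2 ** 4, 2 ** 10, 2 ** 20):
    nlogn = n * math.log2(n)
    logn = math.log2(n)
    print(n, nlogn, logn, nlogn / logn)  # the last column is just n
```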

The base can be omitted in that particular O expression, as all logs are
the same order. (That doesn't apply to exponentials, nor I think to log
log N.)
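
To make the base-dropping concrete (my own illustration): changing base
only multiplies the log by the constant 1/log_2(b), which O absorbs.

```python
import math

# log_b(N) = log_2(N) / log_2(b): switching the base multiplies by a
# constant factor, so inside O(...) every log base gives the same order.
for b in (2, 10, math.e):
    c = 1 / math.log2(b)
    for n in (10, 1000, 10 ** 6):
        assert math.isclose(math.log(n, b), c * math.log2(n))
print("all bases agree up to a constant factor")
```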

There is actually a strict mathematical definition.

f(n) ~ O(g(n)) means that there exist values N and U such that, for all
n > N, f(n) < U*g(n). So in fact, the O notation denotes an upper bound
- any measure that is O(n) is also O(n^2), etc.
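
As a trivial check of that last remark (constants chosen by me for
illustration): f(n) = n meets the O(n^2) definition with N = 1 and U = 1.

```python
# f(n) = n is O(n), but it also satisfies the O(n^2) definition:
# with N = 1 and U = 1, we have n < 1 * n * n for every n > 1.
N, U = 1, 1
for n in range(N + 1, 10 ** 5):
    assert n < U * n * n
print("n is O(n^2) with N = 1, U = 1")
```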

f(n) ~ Omega(g(n)) means that there exist values N and L such that, for
all n > N, f(n) > L*g(n).

f(n) ~ Theta(g(n)) means f(n) ~ O(g(n)) && f(n) ~ Omega(g(n)).
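
Tying this back to the selection sort count quoted above: f(n) = n^2/2 - n/2
is Theta(n^2). The constants below (N = 2, L = 0.25, U = 0.5) are just one
choice of witnesses that happens to work; here's a sketch checking them:

```python
# Check both halves of the Theta definition for f(n) = n^2/2 - n/2,
# the selection-sort comparison count, against g(n) = n^2.
def f(n):
    return n * n / 2 - n / 2

N, L, U = 2, 0.25, 0.5
for n in range(N + 1, 10 ** 4):
    # the Omega bound (L*g) and the O bound (U*g) both hold
    assert L * n * n < f(n) < U * n * n
print("f(n) ~ Theta(n^2)")
```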

Stewart.

--

My e-mail is valid but not my primary mailbox, aside from its being the
unfortunate victim of intensive mail-bombing at the moment. Please keep
replies on the 'group where everyone may benefit.