
# Big Oh Notation

Does anyone know of a good site where I can learn of this?
Jul 17 '05 #1
Big O is pretty simple. The O stands for "the Order of" and it shows the
general number of comparisons needed in a sort algorithm. For example, the
average number of comparisons for a selection sort is =N^2/2-N/2. In this
case the N^2 dominates and so the big O notation is O(N^2). For a quicksort
I believe it is =N(log2N), which is defined as O(log2 N) because the log2N,
against a large number of items, will dictate the amount of time taken.
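As a rough sketch (mine, not from the post above — the function name is made up), you can count the comparisons a selection sort actually makes and check the count against the N^2/2 - N/2 formula:

```python
# Hypothetical sketch: count comparisons in a selection sort and check
# the total against the closed form N^2/2 - N/2.

def selection_sort_comparisons(items):
    """Sort a copy of items; return (sorted_list, comparison_count)."""
    a = list(items)
    comparisons = 0
    for i in range(len(a)):
        smallest = i
        for j in range(i + 1, len(a)):
            comparisons += 1           # one comparison per inner-loop step
            if a[j] < a[smallest]:
                smallest = j
        a[i], a[smallest] = a[smallest], a[i]
    return a, comparisons

n = 10
_, count = selection_sort_comparisons(list(range(n, 0, -1)))
print(count, n * n // 2 - n // 2)  # prints: 45 45
```

The count is the same no matter how the input is ordered, which is why the formula has no best/worst split for selection sort.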

Hope this helps a bit.
"jova" <Em***@nospam.net> wrote in message
Does anyone know of a good site where I can learn of this?

Jul 17 '05 #2

news:hS******************@news-server.bigpond.net.au...
Big O is pretty simple. The O stands for "the Order of" and it shows the
general number of comparisons needed in a sort algorithm. For example, the
average number of comparisons for a selection sort is =N^2/2-N/2. In this
case the N^2 dominates and so the big O notation is O(N^2). For a quicksort
I believe it is =N(log2N), which is defined as O(log2 N) because the log2N,
against a large number of items, will dictate the amount of time taken.

Hope this helps a bit.
"jova" <Em***@nospam.net> wrote in message
Does anyone know of a good site where I can learn of this?

If only we had a sorting algorithm of O(log2N) :-) Unfortunately both N and
log2N are significant here (N is more than log2N anyway) so you get O(N
log2N).

And for QS that is average behaviour, worst case is O(N^2) also. For a
sorting algorithm with worst-case O(N log2N) you would need something like
HeapSort.

BTW: Big O notation is used for any algorithm category, not only for sorting
algorithms.
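To see why that N^2 term matters so much in practice, here is a quick numeric sketch (my own illustration, not from the posts) comparing the two growth rates:

```python
import math

# Illustrative sketch: the N^2 cost of selection sort dwarfs the
# N*log2(N) cost of a good comparison sort once N gets large.
for n in (10, 1000, 1_000_000):
    print(f"{n:>9}  N*log2(N) = {n * math.log2(n):>12.0f}  N^2 = {n * n:>14}")
```

At a million items the gap is roughly a factor of fifty thousand, which is the whole point of preferring O(N log N) sorts.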

Silvio Bierman
Jul 17 '05 #3
Moth wrote:
Big O is pretty simple. The O stands for "the Order of" and it shows the
general number of comparisons needed in a sort algorithm. For example, the
average number of comparisons for a selection sort is =N^2/2-N/2. In this
case the N^2 dominates and so the big O notation is O(N^2). For a quicksort
I believe it is =N(log2N), which is defined as O(log2 N) because the log2N,
against a large number of items, will dictate the amount of time taken.
Oh, just a little thing you missed, I think. Big O notation is the "worst
case" notation, meaning that the worst case is what goes after the O. For
example, the worst case in picking a number out of an array with a simple
for loop is O(n), because n (the number of elements) is the worst-case
scenario (that the number you want is at the end).

There are two other notations. Big-Omega (I don't have the omega symbol,
but it's like the Ohms symbol if you remember that from electronics) means
the best case scenario. For example, if your element is at the beginning of
the array.
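A small sketch of that array-scan example (mine, not the poster's — the function name is made up): a left-to-right search does 1 comparison in the best case and n in the worst:

```python
def linear_search_comparisons(items, target):
    """Return (index_or_None, comparisons) for a simple left-to-right scan."""
    comparisons = 0
    for i, x in enumerate(items):
        comparisons += 1          # one equality test per element visited
        if x == target:
            return i, comparisons
    return None, comparisons

data = list(range(100))
print(linear_search_comparisons(data, 0))    # best case:  (0, 1)
print(linear_search_comparisons(data, 99))   # worst case: (99, 100)
```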

Now there is another one, called the 'tight bound'. This is your "average"
case and is the best for comparing two algorithms. Since an algorithm could
be excellent but have one very bad case, O notation only shows the worst
case, while the Big-Theta notation shows the average case.

If you are a beginner you can compare algorithms with O notation, but if
you have done this before or have some math background, I would suggest
comparing with Big-Theta.

For a good site, it's usually mathematics sites that have it. Here is our
class book; this is probably what you are looking for.

Cormen, Leiserson, Rivest, and Stein: "Introduction to Algorithms",
McGraw Hill, MIT Press.

Or any good Discrete Math book will have it.

Hope this helps a bit.
"jova" <Em***@nospam.net> wrote in message
Does anyone know of a good site where I can learn of this?

Jul 17 '05 #4
While it was 2/23/04 9:01 PM throughout the UK, Moth sprinkled little
black dots on a white screen, and they fell thus:
Big O is pretty simple. The O stands for "the Order of" and it shows the
general number of comparisons needed in a sort algorithm. For example, the
average number of comparisons for a selection sort is =N^2/2-N/2. In this
case the N^2 dominates and so the big O notation is O(N^2). For a quicksort
I believe it is =N(log2N)
That looks like it means either:

- the logarithm to some unspecified base of 2N
- the function of logarithm to base 2N

I think you mean N log2 N

or if you want to be a little clearer, N * log_2 N
which is defined as O(log2 N) because the log2N,
against a large number of items, will dictate the amount of time taken.

Actually, O(N*log N) is an order in its own right. N*log_2 N is far
from asymptotic to log_2 N - try plotting it and you'll see.
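Rather than plotting, a quick numeric check makes the same point (my own sketch): the ratio of N*log2(N) to log2(N) is simply N, which grows without bound:

```python
import math

# Sketch: N*log2(N) divided by log2(N) is exactly N, so N*log2(N)
# cannot possibly be O(log2 N).
for n in (2 ** 10, 2 ** 20, 2 ** 30):
    print(n, (n * math.log2(n)) / math.log2(n))  # the ratio is just n
```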

The base can be omitted in that particular O expression, as all logs are
the same order. (That doesn't apply to exponentials, nor I think to log
log N.)

There is actually a strict mathematical definition.

f(n) ~ O(g(n)) means that there exist values N and U such that, for all
n > N, f(n) < U*g(n). So in fact, the O notation denotes an upper bound
- any measure that is O(n) is also O(n^2), etc.

f(n) ~ Omega(g(n)) means that there exist values N and L such that, for
all n > N, f(n) > L*g(n).

f(n) ~ Theta(g(n)) means f(n) ~ O(g(n)) && f(n) ~ Omega(g(n)).
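Those definitions can be checked numerically for a concrete case (my own sketch; the witness constants U and N below are my choice, not part of the thread). Take f(n) = n^2/2 - n/2 from the selection-sort example and g(n) = n^2:

```python
# Sketch: verify f(n) < U*g(n) for all tested n > N, with witness
# constants U = 1 and N = 1 chosen by hand.
def f(n):
    return n * n / 2 - n / 2   # selection-sort comparison count

def g(n):
    return n * n

U, N = 1, 1
assert all(f(n) < U * g(n) for n in range(N + 1, 10_000))
# The "upper bound" point: the same f also sits below n^3,
# so f is O(n^2) and O(n^3) alike.
assert all(f(n) < n ** 3 for n in range(2, 10_000))
print("definition holds for these witnesses")
```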

Stewart.

--
My e-mail is valid but not my primary mailbox, aside from its being the
unfortunate victim of intensive mail-bombing at the moment. Please keep
replies on the 'group where everyone may benefit.
Jul 17 '05 #5
I learned about Big-Oh from Concrete Mathematics by Graham, Knuth, and
Patashnik, an excellent book. And also Donald Knuth's monumental (well,
three volumes so far) The Art of Computer Programming.

And by the way, I'm almost positive that QS is of order O(n log n).
(And even if it isn't, most people would write it O(n log n) anyway
because

O(n log 2n) = O(n(log 2 + log n)) = O(n log n + n log 2) = O(n log n)

right?)
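That identity is easy to sanity-check numerically (my own sketch): the lower-order n*log2(2) term fades, so the ratio of n*log2(2n) to n*log2(n) tends to 1:

```python
import math

# Sketch: n*log2(2n) = n*(1 + log2 n), so its ratio to n*log2(n)
# approaches 1 as n grows, confirming O(n log 2n) = O(n log n).
for n in (10, 10_000, 10_000_000):
    print(n, (n * math.log2(2 * n)) / (n * math.log2(n)))
```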
Jul 17 '05 #6
Oskar Sigvardsson wrote:
I learned about Big-Oh from Concrete Mathematics by Graham, Knuth,
Patashnik, excellent book. And also Donald Knuths monumental (well,
three volumes so far) The Art Of Computer Programming.

And by the way, I'm almost positive that QS is of order O(n log n).
In the average case, yes. It's O(n^2) worst case, O(n log n) best case.
(And even if it isn't, most people would write it O(n log n) anyway
because

O(n log 2n) = O(n(log 2 + log n)) = O(n log n + n log 2) = O(n log n)

Either something is O(n log n) or it isn't. O(n log 2n) is indeed
identical to O(n log n). An order is an order, not the way some
convention chooses to write it down.

Stewart.

Jul 17 '05 #7
