This is a simple question compared to some of the stuff that gets asked (and
answered) here, but sometimes it's easy to overlook the simpler things.
I've been working with databases for a few years now, but have no formal
training, so sometimes you just get on with it, and if it works you don't
worry about it. But sometimes I wonder just how it works under the hood.
For example:
Say I have 100,000 records in a complicated view, and each record, amongst
other things, contains a name and an archived flag represented by a bit,
which are pulled together from individual tables.
Of the 100,000 records, say 25 have the bit set to 0 and the remainder have
it set to 1 (for instance, 0 might mean data from the current month that has
not yet been archived).
If I'm looking for the name 'fred', which I know is amongst the 25 (current
data), I would use something like:
select name from view where name = 'fred' and bit = 0
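To make the question concrete, here's a stripped-down version of the kind of setup I mean. The table, column, and view names are all made up, and I'm using SQLite from Python purely so it runs anywhere; my real database is bigger and more complicated:

```python
import sqlite3

# In-memory database with a made-up schema standing in for the real tables.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE people (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE status (person_id INTEGER, archived INTEGER);
    -- The view joins the individual tables together, as in the question.
    CREATE VIEW v AS
        SELECT p.name, s.archived
        FROM people p JOIN status s ON s.person_id = p.id;
""")
con.executemany("INSERT INTO people VALUES (?, ?)",
                [(1, "fred"), (2, "jim"), (3, "sheila")])
con.executemany("INSERT INTO status VALUES (?, ?)",
                [(1, 0), (2, 1), (3, 1)])  # only fred is un-archived (bit = 0)

# The query from the question: name = 'fred' and the archived bit = 0.
rows = con.execute(
    "SELECT name FROM v WHERE name = 'fred' AND archived = 0"
).fetchall()
print(rows)  # [('fred',)]
```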
So in my own mind I'm thinking the sensible thing for the database to do is
to discard the mass of records where the archived bit = 1 and then only
search the remaining 25.
But then I think there has to be a time overhead in discarding the many
thousands I'm not interested in. Does SQL have to generate the view for all
the data, process each record in turn, saying "I'm not interested in this
one, or this one", and so on, pull the ones it is interested in out into a
second list, and then process that second list?
Or does it process the relevant sub-tables first, i.e. find the matching
records for fred and the matching ones for bit = 0, and then apply the rules
of the view afterwards?
Or (as I suspect) does it do it in a much cleverer way?
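One thing I did find is that you can ask the engine itself which of these it plans to do. This is a sketch using SQLite's EXPLAIN QUERY PLAN (again with made-up table names; other engines have their own equivalents, e.g. execution plans in SQL Server), which shows whether the engine pushes the WHERE conditions down into the underlying tables rather than materialising the whole view first:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE people (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE status (person_id INTEGER, archived INTEGER);
    CREATE INDEX idx_status_archived ON status (archived);
    CREATE VIEW v AS
        SELECT p.name, s.archived
        FROM people p JOIN status s ON s.person_id = p.id;
""")

# Ask SQLite how it would run the query, instead of actually running it.
plan = con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT name FROM v WHERE name = 'fred' AND archived = 0"
).fetchall()
for row in plan:
    print(row)  # each row describes one step of the plan
```

On my machine the plan shows it searching the underlying tables directly (using the index on the archived column), not building the whole view and then filtering it.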
thanks for any help
Andy