
SQL native alternative to NZ()?

TheSmileyCoder
Expert Mod 100+
P: 2,321
I remember reading once that, in order to optimize queries, you should avoid VBA calls wherever an alternative exists.

I have a query which uses a left join to bring in a set of counts. Sometimes this left join will yield Nulls, and I want to replace those Nulls with zeros, as the query is being used to export to Excel. Is there an alternative to NZ() that can be used in such a case?
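
For context, the pattern is roughly the following. This is a minimal sketch only: all table, query and field names are invented, qryItemCounts stands in for a saved GROUP BY query producing one count per key, and a saved query qryExport is assumed to exist to hold the SQL:

Public Sub ExportCountsToExcel()
    ' The LEFT JOIN leaves Null wherever a project has no counted
    ' items; NZ() turns that Null into 0 for the export.
    Dim db As DAO.Database
    Set db = CurrentDb
    db.QueryDefs("qryExport").SQL = _
        "SELECT P.ProjectID, NZ(C.ItemCount, 0) AS ItemCount " & _
        "FROM tblProject AS P LEFT JOIN qryItemCounts AS C " & _
        "  ON P.ProjectID = C.ProjectID;"
    DoCmd.TransferSpreadsheet acExport, acSpreadsheetTypeExcel12Xml, _
        "qryExport", CurrentProject.Path & "\Counts.xlsx", True
End Sub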
Apr 30 '12 #1

✓ answered by NeoPa (see reply #3 below)

6 Replies


nico5038
Expert 2.5K+
P: 3,072
Guess you're out of luck here. The NZ() function is the only option, besides adding rows to the left-joined table with a zero value for the missing join key, or storing the needed counts in the table with an update query, making the JOIN obsolete.
The UPDATE is only useful when the table is used multiple times for reporting without the need for a refresh (a sketch follows below).
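
A minimal sketch of that UPDATE approach (invented names; assumes tblProject carries a stored ItemCount column):

Public Sub RefreshStoredCounts()
    ' DCount() returns 0 when nothing matches, so the stored value
    ' never needs NZ(). Access will not accept an UPDATE joined
    ' directly to a GROUP BY query, hence the domain function here;
    ' its cost is acceptable because this runs once per refresh,
    ' not once per report row.
    CurrentDb.Execute _
        "UPDATE tblProject SET ItemCount = " & _
        "DCount('*', 'tblItem', 'ProjectID=' & [ProjectID]);", _
        dbFailOnError
End Sub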

B.t.w., NZ() is a built-in function and not really VBA. I would guess Access already optimizes this function to be loaded only once. DLOOKUP() and the other domain functions are the true performance killers...
Apr 30 '12 #2

NeoPa
Expert Mod 15k+
P: 31,487
Nico5038:
The DLOOKUP() and other functions are the true performance killers...
Nico explains the situation very well (of course). However, I would add that although this is absolutely right (domain functions are the biggest killers of performance, as they embed processing of a whole dataset), I don't believe it's what is meant by avoiding VBA functions. Let me explain further.

Inbuilt functions generally have the least impact. No functions at all is obviously best, and any call to any function is likely to cause some impact, as it will typically need to be called to process every line of input, which can upset the optimising of a query.

VBA functions (those within a project, which are still only semi-compiled) cause further delay. These, I believe, are the subject of what Smiley's heard. Each separate line of code needs to be interpreted before being executed, and this interpreting can cause many times the delay of running the actual code itself. An illustrative sketch follows below.
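
As an illustration (invented names), a user-defined function like this one, called as FullName([First], [Last]) in a query, is re-interpreted for every row the query processes:

Public Function FullName(FirstName As Variant, LastName As Variant) As String
    ' Each line here is interpreted from semi-compiled p-code every
    ' time the query calls the function, once per row.
    FullName = Trim(Nz(FirstName, "") & " " & Nz(LastName, ""))
End Function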

Domain functions (those starting with D) probably have the greatest impact, due to having to process a whole recordset within the function itself. As all of this happens outside the current SQL, it is also a process that cannot be incorporated into the optimisation of the SQL itself. A comparison is sketched below.
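
To make that concrete, a hedged comparison (invented names): the first query calls DLookUp() once per row of tblOrder; the second produces the same output via a join the engine can optimise internally:

Public Sub CompareLookupStyles()
    ' Per-row domain call: DLookUp() opens its own search of tblRates
    ' for every record, outside the query plan.
    Const SLOW_SQL As String = _
        "SELECT OrderID, " & _
        "DLookUp('Rate', 'tblRates', 'RateID=' & [RateID]) AS Rate " & _
        "FROM tblOrder;"
    ' Same output, but the join can be optimised inside the engine.
    Const FAST_SQL As String = _
        "SELECT O.OrderID, R.Rate " & _
        "FROM tblOrder AS O INNER JOIN tblRates AS R " & _
        "  ON O.RateID = R.RateID;"
    Debug.Print SLOW_SQL
    Debug.Print FAST_SQL
End Sub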

A further point to bear in mind is that the impact of any such call is greatly increased when it occurs within the WHERE clause of the SQL. This seems to stop much of the SQL magic occurring: as functions are fundamentally unpredictable, the optimiser has to treat them as black boxes. An example follows below.
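
A small sketch of that effect (invented names; Year() stands in for any function call). The first form must evaluate the function against every row before it can filter; the second compares the raw field, so an index on OrderDate remains usable:

Public Sub CompareWhereStyles()
    ' Function call in the WHERE clause: evaluated for every row tested.
    Const SLOW_SQL As String = _
        "SELECT * FROM tblOrder WHERE Year([OrderDate]) = 2012;"
    ' Equivalent filter on the raw field: an index can be used.
    Const FAST_SQL As String = _
        "SELECT * FROM tblOrder " & _
        "WHERE [OrderDate] >= #1/1/2012# AND [OrderDate] < #1/1/2013#;"
    Debug.Print SLOW_SQL
    Debug.Print FAST_SQL
End Sub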

PS. One last point to mention: functions called within SQL which do not take parameters from the data are assumed to return the same value for each call. These are accordingly run only once, and the value is saved and reused wherever such a call is found. Functions which take a field parameter, though, are rerun for every record. It is typically only in the latter scenario that the performance hit is found. See the sketch below.
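
A sketch of the difference (MyVbaTax() is a hypothetical user-defined VBA function; the table and field names are invented too):

Public Sub CompareCallFrequency()
    ' Date() takes no field parameter, so it is evaluated once and its
    ' value reused for every row of the output.
    Const RUN_ONCE As String = _
        "SELECT OrderID, Date() AS ExportedOn FROM tblOrder;"
    ' MyVbaTax() takes a field parameter, so it is rerun per record.
    Const RUN_PER_ROW As String = _
        "SELECT OrderID, MyVbaTax([Amount]) AS Tax FROM tblOrder;"
    Debug.Print RUN_ONCE
    Debug.Print RUN_PER_ROW
End Sub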
Apr 30 '12 #3

TheSmileyCoder
Expert Mod 100+
P: 2,321
Thank you both for your replies. It's also a very important P.S. that you bring up there, NeoPa. I still remember when I wasn't aware of that particular issue and spent a good amount of time figuring it out. Another point to make is that Access (at least AC2010) will not always evaluate the function for all rows, but only for those visible.
Apr 30 '12 #4

NeoPa
Expert Mod 15k+
P: 31,487
Smiley:
Another point to make is that Access (at least AC2010) will not always evaluate the function for all rows, but only for those visible.
Indeed :-)

The penultimate paragraph (the one before the PS.) covers this area. If the call is within the WHERE clause, then every row that is checked will cause the call to be processed; if it's outside the WHERE clause, only those rows that pass the filtering stage will cause the call to be processed.
May 1 '12 #5

TheSmileyCoder
Expert Mod 100+
P: 2,321
NeoPa:
The penultimate paragraph (the one before the PS.) covers this area. If the call is within the WHERE clause, then every row that is checked will cause the call to be processed; if it's outside the WHERE clause, only those rows that pass the filtering stage will cause the call to be processed.
It is certainly true that if the call is within the WHERE clause it will be processed for each row (and I presume the same goes for ORDER BY).

My point was slightly different: if you have a continuous form showing 10 out of 100 records, the function will (or at least might) only be processed for the first 10 rows, and then for the next 10 rows on demand (i.e. when the user scrolls).
May 2 '12 #6

NeoPa
Expert Mod 15k+
P: 31,487
Ah. I see. Let me deal with each point in turn.

ORDER BY fields still only affect those records which have passed the WHERE test, so they will cause delay only for records included. However, an ORDER BY can sometimes force the whole of the dataset to be processed before any records are shown, which might not otherwise be necessary. See the next paragraph.

Displayed records are not exactly how it works. There is an element of processing only a set number of records when the dataset is large, but this buffer is not restricted to the records currently displayed. The value you would be looking for (I believe) is found as .MaxRecords at the query level, and the default is set in Tools | Options | Edit/Find (or the 2007/2010 equivalent). A sketch of reading it in code follows below.
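
For anyone wanting to inspect that value in code, a minimal sketch (MaxRecords is a DAO QueryDef property, where 0 means no limit; the query name here is invented):

Public Sub ShowMaxRecords()
    ' Read the per-query record cap mentioned above.
    Dim qdf As DAO.QueryDef
    Set qdf = CurrentDb.QueryDefs("qryExport")
    Debug.Print qdf.Name & ": MaxRecords = " & qdf.MaxRecords
End Sub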
May 2 '12 #7
