
How to Energize a Dreadfully Slow Query

There must be a way to enhance the performance of a query, or find a
plausible workaround, but I seem to be hitting a wall. I have tried a
few tweaks, but there has been no improvement.

Simply put, I'm including one calculation from a separate table/query. It
sums the total units sold to date by ProductID number and is used in
other select queries to perform various calculations. Perhaps there is
an advantage in testing with the maximum number of records the database
would ever hold, so that real-world conditions are fully evaluated; in
this case, there are nearly 10,000 records.

I have tried DSum and a function SumUnits(ProductID), but the
performance of either is nearly the same: it takes about 25 seconds
after clicking sort or filter on any of the calculated columns on the
form before the operation completes - yuk. I tried a Totals query to
check the speed and, voila, just a couple of seconds; but of course it
doesn't matter how or where a Totals query is used, since it renders
any other select query non-updatable.

I have thought about using an Update query to store a 'calculated value'
as an alternative, but that method is generally frowned upon. And even
going that route, it would still take at least 25 seconds every time the
form is opened to refresh the numbers, though once the form was open,
any resorting or refiltering would be instant.

All of the other calculations on the query are very basic: no DSums,
no DLookups, etc. By temporarily removing the UnitsSold-to-date field
described above, performance drops to under a second. Any suggestions
will be welcomed.

Thanks, Rolan

Nov 13 '05 #1
You could open a recordset from code based on the GROUP BY/Sum query, and
create a function that uses FindFirst to pull the correct sum from that
open recordset; see the sketch below.
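A minimal sketch of the idea, untested, with InvoiceDetails and Units
standing in for whatever your table and field are actually called:

Dim rsTotals As DAO.Recordset

Public Sub OpenTotals()
    ' Run the GROUP BY/Sum once and keep the result open
    Set rsTotals = CurrentDb.OpenRecordset( _
        "SELECT ProductID, Sum(Units) AS TotalUnits " & _
        "FROM InvoiceDetails GROUP BY ProductID", dbOpenSnapshot)
End Sub

Public Function SumUnits(lProductID As Long) As Long
    ' Look the precomputed sum up in the open recordset
    ' (assumes OpenTotals has already run)
    rsTotals.FindFirst "ProductID = " & lProductID
    If Not rsTotals.NoMatch Then SumUnits = rsTotals!TotalUnits
End Function

Call OpenTotals in the form's Open event; the per-row function then never
reruns the aggregation.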

On 28 Jan 2005 07:03:37 -0800, "Rolan" <co******@safe-mail.net> wrote:
> There must be a way to enhance the performance of a query, or find a
> plausible workaround, but I seem to be hitting a wall. [snip]


Nov 13 '05 #2
On 28 Jan 2005 07:03:37 -0800, "Rolan" <co******@safe-mail.net> wrote:

Hi
Not sure I understand this, but if the totals query is so fast, can't
you make a temporary table from it and use that to make the main
query/queries updateable?
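Something along these lines, say (tmpUnitsSold and the other names are
just placeholders):

' Rebuild the local totals table; the main query then joins
' table-to-table and stays updatable
CurrentDb.Execute "DELETE FROM tmpUnitsSold", dbFailOnError
CurrentDb.Execute _
    "INSERT INTO tmpUnitsSold (ProductID, TotalUnits) " & _
    "SELECT ProductID, Sum(Units) FROM InvoiceDetails " & _
    "GROUP BY ProductID", dbFailOnError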
David

Nov 13 '05 #3
Well, let's think about your form design. If you have a continuous-style
form where one of the controls is a domain function (DSum, DCount, etc.),
then a query to calculate that value is run once for each record. As
you've discovered, this can take some time when loading many records.
Besides the (perhaps not so) obvious help of indexing any/most fields you
search or sort on, I think you might consider removing the calculated
value from the continuous detail section and moving it to a pop-up form
that users could view on demand. Put a command button in the main form's
header/footer that would open that form, with a criteria link to the
current detail record; see the sketch below.
-Ed
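A minimal sketch of such a button's Click event, with frmUnitsSold and
the control names assumed:

Private Sub cmdShowUnits_Click()
    ' Open the pop-up filtered to the product on the current row
    DoCmd.OpenForm "frmUnitsSold", _
        WhereCondition:="ProductID = " & Me!ProductID
End Sub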

"Rolan" <co******@saf e-mail.net> wrote in message
news:11******** **************@ c13g2000cwb.goo glegroups.com.. .
> There must be a way to enhance the performance of a query, or find a
> plausible workaround, but I seem to be hitting a wall. [snip]

Nov 13 '05 #4
On 28 Jan 2005 10:08:04 -0600, d.***************@blueyonder.co.uk (David
Schofield) wrote:
> Not sure I understand this, but if the totals query is so fast, can't
> you make a temporary table from it and use that to make the main
> query/queries updateable?


People like me keep advocating against Temp tables <g>.

There's a variant on Temp tables that is less problematic. Create a
permanent table in the back-end database that is a cache of the group/sum
and that gets updated whenever the primary data is changed. This does
require a periodic maintenance process to check and correct the integrity
of the sums, though.
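The update step might look something like this (tblUnitsSoldCache and the
other names are only illustrative):

Public Sub UpdateUnitsCache(lProductID As Long)
    ' Recompute one product's total and write it to the cache table
    CurrentDb.Execute _
        "UPDATE tblUnitsSoldCache SET TotalUnits = " & _
        Nz(DSum("Units", "InvoiceDetails", _
            "ProductID = " & lProductID), 0) & _
        " WHERE ProductID = " & lProductID, dbFailOnError
End Sub

A single-product DSum like this is cheap; it's the per-row DSum across
the whole form that hurts.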
Nov 13 '05 #5
Thanks, everyone, for your input. I'm thinking that a variation on the
theme might be a consideration. Since keeping the process reasonably
transparent and seamless is important, perhaps having an Update query
run on close of the invoice form to post the current UnitsSold numbers
to a new field in the products table might be an acceptable workaround.
Well, the saltmine awaits my return . . .
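Roughly this, in the invoice form's Close event (table and field names
assumed):

Private Sub Form_Close()
    ' Post this invoice's product total back to the Products table
    CurrentDb.Execute _
        "UPDATE Products SET UnitsSoldToDate = " & _
        Nz(DSum("Units", "InvoiceDetails", _
            "ProductID = " & Me!ProductID), 0) & _
        " WHERE ProductID = " & Me!ProductID, dbFailOnError
End Sub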
Steve Jorgensen wrote:
> There's a variant on Temp tables that is less problematic. Create a
> permanent table in the back-end database that is a cache of the
> group/sum that gets updated whenever the primary data is changed. [snip]

Nov 13 '05 #6
On Fri, 28 Jan 2005 08:21:40 -0800, Steve Jorgensen
<no****@nospam.nospam> wrote:
> [snip]
> There's a variant on Temp tables that is less problematic. Create a
> permanent table in the back-end database that is a cache of the
> group/sum that gets updated whenever the primary data is changed. This
> does require a periodic maintenance process to check and correct the
> integrity of the sums, though.

Hi
I know what you mean about avoiding temporary tables.
Aggregation in real time can be a problem, and presummarisation is
unavoidable for large systems, e.g. OLAP for multi-dimensional data.
However, Rolan's problem isn't that the aggregation takes too much time,
but that the result of it isn't usable in the context of his query.
How would this be done in other databases?
David

Nov 13 '05 #7
"Rolan" <co******@saf e-mail.net> wrote in
news:11******** *************@c 13g2000cwb.goog legroups.com:
> Thanks, everyone, for your input. I'm thinking that a variation on
> the theme might be a consideration. Since keeping the process
> reasonably transparent and seamless is important, perhaps having
> an Update query run on close of the invoice form to post the
> current UnitsSold numbers to a new field in the products table
> might be an acceptable workaround.


Don't do it. It will eventually fail.

On a billing app that I created early in my Access career, I did
this to maintain open balances on invoices because the machines the
client was running were substantially slower than my own (I was
running a P120, but they were running P100s that were half as fast
as mine; they were off-brand machines and laptops with very slow
hard drives).

The calculated data was never accurate. It never stayed accurate. It
always got out of line, and it was hard to tell *when* it was wrong,
so you could never depend on the results.

Now, if you are using something other than an MDB to store your data
and have triggers available, that's a different story.

--
David W. Fenton http://www.bway.net/~dfenton
dfenton at bway dot net http://www.bway.net/~dfassoc
Nov 13 '05 #8
Steve Jorgensen <no****@nospam.nospam> wrote in
news:1d********************************@4ax.com:
> People like me keep advocating against Temp tables <g>.


I think it's a mistake. I learned long ago that sometimes a temp
table is faster than doing the same process in memory, precisely
because derived data can then be indexed, which can vastly speed up
certain kinds of data retrieval processes.

Yes, it seems counter-intuitive, but it turns out that temp tables
don't slow things down at all in those kinds of circumstances.
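For example (tmpUnitsSold is illustrative; drop it first if it already
exists):

' Build the derived table once, then index it so joins and
' lookups against it run fast
CurrentDb.Execute _
    "SELECT ProductID, Sum(Units) AS TotalUnits " & _
    "INTO tmpUnitsSold FROM InvoiceDetails " & _
    "GROUP BY ProductID", dbFailOnError
CurrentDb.Execute _
    "CREATE INDEX idxProductID ON tmpUnitsSold (ProductID)", _
    dbFailOnError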

--
David W. Fenton http://www.bway.net/~dfenton
dfenton at bway dot net http://www.bway.net/~dfassoc
Nov 13 '05 #9
Rolan wrote:
> There must be a way to enhance the performance of a query, or find a
> plausible workaround, but I seem to be hitting a wall. [snip]


Do you have a separate totals query? One that gets the total number of
units to date, groups on the ProductID, and perhaps also filters within a
date range? I should think that if you link your query to this totals
query, it would zip along quickly; see the sketch below.

Are you filtering on a field that isn't indexed?

10,000 records is not a large amount.

Also, is there a need to perform the calculation in the query? If you can
pull the data together first and then present the calculation in the form
or report, that would be the optimum method.
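Something like this, with qryUnitTotals standing in for your totals query
and Products for your main table:

SELECT p.*, t.TotalUnits
FROM Products AS p
LEFT JOIN qryUnitTotals AS t
ON p.ProductID = t.ProductID;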
Nov 13 '05 #10
