How to Energize a Dreadfully Slow Query

There must be a way to enhance the performance of a query, or find a
plausible workaround, but I seem to be hitting a wall. I have tried a
few tweaks; however, there has been no improvement.

Simply put, I'm including one calculation from a separate table/query.
It sums the total units sold to date by ProductID number, and it is
used in other select queries to perform various calculations. I am
deliberately working with about the maximum number of records the
database would ever hold, in order to evaluate real-world conditions.
In this case, there are nearly 10,000 records.

I have tried DSum and a function SumUnits(ProductID), but the
performance of either is nearly the same. It takes about 25 seconds
after clicking sort or filter on any of the calculated columns on the
form before the operation completes - yuk. I tried a Totals query to
check the speed and, voila, just a couple of seconds; but of course, it
doesn't matter how or where a Totals query is used, as it renders any
other select query non-updatable.

I have thought about using an Update query to store a 'calculated
value' as an alternative, but that method is generally boohooed. Even
going that route, it would still take at least 25 seconds every time
the form is opened to refresh the numbers; but of course, once opened,
any resorting or refiltering would be instant.

All of the other calculations in the query are very basic - no DSums,
no DLookups, etc. And by temporarily removing the UnitsSold-to-date
field described above, performance is under a second. Any suggestions
will be welcomed.

Thanks, Rolan

Nov 13 '05 #1
You could open a recordset from code that does the GROUP BY with SUM, and
create a function that uses FindFirst to get the correct sum from the open
recordset.
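A minimal sketch of that approach, assuming DAO and invented table/field
names (tblSales, ProductID, Units):

    ' Module-level snapshot holding one pre-aggregated total per product.
    Private mrstTotals As DAO.Recordset

    Sub OpenTotals()
        ' Run the GROUP BY / SUM once and keep the recordset open.
        Set mrstTotals = CurrentDb.OpenRecordset( _
            "SELECT ProductID, Sum(Units) AS TotalUnits " & _
            "FROM tblSales GROUP BY ProductID", dbOpenSnapshot)
    End Sub

    Function SumUnits(ByVal lngProductID As Long) As Double
        ' Look up the cached total instead of re-querying per row.
        mrstTotals.FindFirst "ProductID = " & lngProductID
        If Not mrstTotals.NoMatch Then SumUnits = mrstTotals!TotalUnits
    End Function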

Nov 13 '05 #2
On 28 Jan 2005 07:03:37 -0800, "Rolan" <co******@safe-mail.net> wrote:

Hi
Not sure I understand this, but if the totals query is so fast, can't
you make a temporary table from it and use that to make the main
query/queries updatable?
David

Nov 13 '05 #3
Well, let's think about your form design. If you have a continuous-style
form where one of the controls is a Domain function (DSum, DCount, etc.),
then a query to calculate that value is run once for each record. As
you've discovered, this can take some time when loading many records.
Besides the (perhaps not so) obvious help of indexing any/most fields you
search or sort on, I think you might consider removing the calculated
value from the continuous detail section and moving it to a pop-up form
that users could view on demand. Put a command button in the main form's
header/footer that would open that form, with a criteria link to the
current detail record.
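Roughly like this, say (a sketch; the form and control names are invented):

    Private Sub cmdShowUnitsSold_Click()
        ' Open the pop-up filtered to the current detail record, so the
        ' aggregate query runs for one product only, on demand.
        DoCmd.OpenForm "frmUnitsSoldToDate", _
            WhereCondition:="ProductID = " & Me!ProductID
    End Sub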
-Ed

"Rolan" <co******@safe-mail.net> wrote in message
news:11**********************@c13g2000cwb.googlegr oups.com...
There must be a way to enhance the performance of a query, or find a
plausible workaround, but I seem to be hitting a wall. I have tried a
few tweaks, however, there has been no improvement.

Simply, I'm including one calcualtion from a separate table/query. It
sums the total units sold to date by ProductID number and is used in
other select queries to perform various calculations. Perhaps there is
an advantage in working with a maximum number of records that would be
used in the database in order to fully evaluate real-world conditions.
In this case, there are nearly 10,000 records.

I have tried DSum and a Function SumUnits(ProductID), but the
performance of either is nearly the same. It takes about 25 seconds
after clicking sort or filter on any of the calculated columns on the
form before it is completed - yuk. Tried a Totals query to check the
speed and voila, just a couple of seconds, but of course, it doesn't
matter how or where a Totals query is used as it renders any other
select query non-updatable.

Have thought about using an Update query to store a 'calculated value'
as an alternative, but that method is generally boohooed. But even
going that route, it would still take at least 25 seconds everytime the
form is opened to refresh the numbers, but of course, once opened any
resorting or refiltering would be instant.

All of the other calculations on the query are very basic, no DSum's,
no DLookup's, etc. And by temporarily removing the UnitsSold to date
field above, performance is under a second. Any suggestions will be
welcomed.

Thanks, Rolan

Nov 13 '05 #4
People like me keep advocating against Temp tables <g>.

There's a variant on Temp tables that is less problematic. Create a
permanent table in the back-end database that is a cache of the group/sum
totals and gets updated whenever the primary data is changed. This does
require a periodic maintenance process to check and correct the integrity
of the sums, though.
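One way to maintain such a cache might be (a sketch: the table and field
names are invented, and the recompute-on-edit wiring is assumed):

    Sub UpdateSoldCache(ByVal lngProductID As Long)
        Dim dblTotal As Double
        ' Recompute only the affected product's total, then write it to
        ' the cache table in the back-end.
        dblTotal = Nz(DSum("Units", "tblSales", _
            "ProductID = " & lngProductID), 0)
        CurrentDb.Execute "UPDATE tblUnitsSoldCache SET UnitsSold = " & _
            dblTotal & " WHERE ProductID = " & lngProductID, dbFailOnError
    End Sub

Call it from the invoice form's AfterUpdate/AfterDelConfirm events for the
product just edited.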
Nov 13 '05 #5
Thanks everyone for your input. I'm thinking that a variation on the
theme might be a consideration. Since keeping the process reasonably
transparent and seamless is important, perhaps having an Update query
run on close of the invoice form, to post the current UnitsSold numbers
to a new field in the products table, might be an acceptable workaround.
Well, the salt mine awaits my return . . .
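Perhaps something like this in the invoice form's module (a sketch; the
saved update query name is invented):

    Private Sub Form_Close()
        ' Post current sold-to-date totals into the products table.
        ' qupdUnitsSold is a saved UPDATE query that joins the totals
        ' to tblProducts and overwrites its UnitsSold field.
        CurrentDb.Execute "qupdUnitsSold", dbFailOnError
    End Sub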

Nov 13 '05 #6
Hi
I know what you mean about avoiding temporary tables.
Aggregation in real time can be a problem, and pre-summarisation is
unavoidable for large systems, e.g. OLAP for multi-dimensional data.
However, Rolan's problem isn't that the aggregation takes too much time
but that the result of it isn't usable in the context of his query.
How would this be done in other databases?
David

Nov 13 '05 #7
"Rolan" <co******@safe-mail.net> wrote in
news:11*********************@c13g2000cwb.googlegro ups.com:
Thanks everyone for your input. I'm thinking that a variation of
the theme might be a consideration. Since keeping the process
reasonably transparent and seamless is important, perhaps having
an Update query to run on close of the invoice form to post the
current UnitsSold numbers to a new field in the products table
might be an acceptable workaround.


Don't do it. It will eventually fail.

On a billing app that I created early in my Access career, I did
this to maintain open balances on invoices because the machines the
client was running were substantially slower than my own (I was
running a P120, but they were running P100s that were half as fast
as mine; they were off-brand machines and laptops with very slow
hard drives).

The calculated data was never accurate. It never stayed accurate. It
always got out of line, and it was hard to tell *when* it was wrong,
so you could never depend on the results.

Now, if you are using something other than an MDB to store your data
and have triggers available, that's a different story.

--
David W. Fenton http://www.bway.net/~dfenton
dfenton at bway dot net http://www.bway.net/~dfassoc
Nov 13 '05 #8
Steve Jorgensen <no****@nospam.nospam> wrote in
news:1d********************************@4ax.com:
People like me keep advocating against Temp tables <g>.


I think it's a mistake. I learned long ago that sometimes a temp
table is faster than doing the same process in memory, precisely
because derived data can then be indexed, which can vastly speed up
certain kinds of data retrieval processes.

Yes, it seems counter-intuitive, but it turns out that temp tables
don't slow things down at all in those kinds of circumstances.
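For example, once derived data has been dumped into a temp table, the
join/sort fields can be indexed with plain DDL (a sketch; names invented):

    ' Index the lookup key so joins and sorts against the derived
    ' data can use it.
    CurrentDb.Execute "CREATE INDEX idxProductID " & _
        "ON tblTempTotals (ProductID)", dbFailOnError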

--
David W. Fenton http://www.bway.net/~dfenton
dfenton at bway dot net http://www.bway.net/~dfassoc
Nov 13 '05 #9
Do you have a separate totals query? One that gets the total number of
units to date and groups on the product id and may also filter within a
date range? I should think that if you link your query to this totals
query it would zip along quickly.

Are you filtering on a field that isn't indexed?

10,000 records is not a large amount.

Also, is there a need to perform a calculation in the query? If you can
pull the data together first and then present the calculation in the
form or report, that would be the optimum method.
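In SQL terms, the linked query would look something like this (names
invented; qtotUnitsSold being the saved Totals query):

    SELECT p.ProductID, p.PartNumber, t.UnitsSold
    FROM tblProducts AS p
    INNER JOIN qtotUnitsSold AS t ON p.ProductID = t.ProductID;

Though, as the next reply notes, joining to an aggregate query makes the
result read-only, which is exactly the sticking point here.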
Nov 13 '05 #10
It sounds as if he tried that first, but he needed it to be editable, and that
query isn't.

Nov 13 '05 #11
Please excuse the lack of punctuality, and thanks for all of the
suggestions from everyone. I implemented the following and so far it
seems to work very well, not to mention that it is extremely fast by
comparison. I presume this is a derivative of a temporary table
process. Perhaps on a small network of 3 or 4 users most potential
problems should be mitigated.

I took the Totals query (PartID, PartNumber and SoldQty) and converted
it to a Make-Table query, then added code to the OnClose event of the
Invoice form to update (replace) the table with the most current,
cumulative sold-parts data. Then, in the four or five other queries
that include the sold-to-date column, I linked the Make-Table's PartID
to the matching field in tblParts. The calculations of other columns
using the sold data work fine, but the performance boost in resorting
and filtering is, in a word, "astonishing", plus they all remain
updatable queries. Even a small time lag of a few minutes before an
invoice is closed is tenable and should cause no adverse effects.
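The OnClose wiring would presumably be along these lines (a sketch; the
Make-Table query name is invented):

    Private Sub Form_Close()
        ' Replace the totals snapshot with current data whenever an
        ' invoice closes. qmakSoldTotals is the Make-Table version of
        ' the Totals query; it recreates the table of PartID,
        ' PartNumber and SoldQty.
        DoCmd.SetWarnings False
        DoCmd.OpenQuery "qmakSoldTotals"
        DoCmd.SetWarnings True
    End Sub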

Thanks again,

Rolan

Nov 13 '05 #12
If you're using a Make Table query, it's not a derivative of the
temporary table solution, it -is- the temporary table solution. That's
OK, but be aware of the pitfalls.

Your front-end will now bloat and need to be compacted, plus the extra
activity means that it's more likely the front-end may become corrupted
at some point. Finally, in spite of your best efforts, it may not be
possible to completely prevent the front-end from being opened in more
than one Access session at once, so two people performing the same
operation will step on each other.

Nov 13 '05 #13
Thanks Steve for the word of discouragement :-( Although thinking
about matters of eventuality is a prudent consideration. Prior to the
Make-Table, I had used the following function, as created by Trevor
Best, but specifically constructed for the SoldQty totals only. I
believe you had alluded to using the OpenRecordset method as a better
alternative. Is this more or less what you had in mind?

Using this function or DSum was nearly identical in performance and
time (25 seconds) when resorting or filtering any of the form's
calculated columns that contained the SoldQty. Why is it that the
Access Totals query is so much faster? Although I understand why a
Totals query is not updatable, why does it also prevent other query
fields from being updated when linked? Thanks again.

Function tSum(pstrField As String, pstrTable As String, _
        pstrCriteria As String) As Double
    On Error GoTo tSum_Err
    ' Function tSum
    ' Purpose: Replace DSum, which is slow on attached tables
    ' Created: 1 Feb 1996 T.Best
    Dim dbCurrent As Database
    Dim rstLookup As Recordset
    Dim dblValue As Double

    Set dbCurrent = DBEngine(0)(0)
    If pstrCriteria = "" Then
        Set rstLookup = dbCurrent.OpenRecordset("Select Sum([" & _
            pstrField & "]) From [" & pstrTable & "]", dbOpenSnapshot)
    Else
        Set rstLookup = dbCurrent.OpenRecordset("Select Sum([" & _
            pstrField & "]) From [" & pstrTable & "] Where " & _
            pstrCriteria, dbOpenSnapshot)
    End If
    If Not rstLookup.BOF Then
        rstLookup.MoveFirst
        ' Sum() over zero rows returns Null, so guard the assignment.
        dblValue = Nz(rstLookup(0), 0)
    Else
        dblValue = 0
    End If
    rstLookup.Close
    tSum = dblValue

tSum_Exit:
    On Error Resume Next
    rstLookup.Close
    Exit Function

tSum_Err:
    MsgBox Error, 16, "Error " & Err
    Resume tSum_Exit
End Function

Nov 13 '05 #14
Ouch - I didn't realize that you wanted to sort or filter on an
aggregate value column. Sorting or filtering on a value that's computed
by calling a procedure is one of the slowest things Access can do. It's
not at all what Jet was optimized for. If that's really a requirement,
perhaps you truly do need some kind of temporary table solution.

Probably the cleanest-functioning approach, though not terribly simple
to implement, is to actually create a temporary database and create the
new table in that database. Although it's a bear to get going for the
first case, you can reuse the code if you have more cases to implement.
The steps are roughly as follows.

1. Check to see if a temporary database has already been created, and
create/open it if not.

1a. Generate a full path to the new database file. When creating the
temporary database, use a Windows API call to get the path to the temporary
directory, but I recommend rolling your own algorithm for the random temporary
file name (it's simpler). Store the new path string in a global variable.

1b. Create the new database and keep a reference to the open database instance
in a global variable.

2. In code, generate SQL for a make table query to populate the new temporary
table in the temporary database. Use the form "SELECT ... INTO <tablename> IN
<databasepath>". Execute this using a temporary Querydef (created with "" for
Name argument).

3. In code, generate the SQL for your form's recordset, joining the local data
to the temporary table. To do this, you'll need to use a "derived table" that
selects from the table in the temp db "[SELECT ... FROM <tempdbtable> IN
<tempdbpath>]. As [<alias>]" in the FROM clause of the main query (as if it
were simply another table).

4. When the app closes, close and delete the temporary database. You can
implement this using a hidden form that is opened either on application
start-up or before the temporary database is first created. When Access
closes, it must close that form, which can perform the clean-up operations via
its Unload event handler.

It's ugly to describe, but it works, it doesn't bloat the front-end or
increase the risk of corruption, and it's insensitive to whether the same
front-end is being shared by multiple Access sessions.
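Rough VBA along those lines (a sketch only: all names are invented,
error handling is minimal, and Environ("TEMP") stands in for the API
call for brevity):

    Private mstrTempDbPath As String
    Private mdbTemp As DAO.Database

    Sub EnsureTempDb()
        ' Steps 1/1a/1b: create the temp database once; remember its
        ' path and keep the open instance in module-level variables.
        If Len(mstrTempDbPath) = 0 Then
            mstrTempDbPath = Environ("TEMP") & "\qtmp_" & _
                Format(Now, "yyyymmddhhnnss") & ".mdb"
            Set mdbTemp = DBEngine.Workspaces(0).CreateDatabase( _
                mstrTempDbPath, dbLangGeneral)
        End If
    End Sub

    Sub BuildTotalsTable()
        ' Step 2: make-table into the temp database via a temporary
        ' QueryDef. (Drop tblTotals first if rebuilding mid-session.)
        Dim qdf As DAO.QueryDef
        Set qdf = CurrentDb.CreateQueryDef("", _
            "SELECT ProductID, Sum(Units) AS UnitsSold " & _
            "INTO tblTotals IN '" & mstrTempDbPath & "' " & _
            "FROM tblSales GROUP BY ProductID")
        qdf.Execute dbFailOnError
    End Sub

    Function FormRecordSource() As String
        ' Step 3: join local data to the temp table through a derived
        ' table in the FROM clause ([SELECT ...]. AS alias is Jet's
        ' bracket syntax).
        FormRecordSource = _
            "SELECT p.*, t.UnitsSold FROM tblProducts AS p INNER JOIN " & _
            "[SELECT ProductID, UnitsSold FROM tblTotals IN '" & _
            mstrTempDbPath & "']. AS t ON p.ProductID = t.ProductID"
    End Function

    Sub CleanUpTempDb()
        ' Step 4: call from the hidden form's Unload event at shutdown.
        On Error Resume Next
        If Not mdbTemp Is Nothing Then mdbTemp.Close
        If Len(mstrTempDbPath) > 0 Then Kill mstrTempDbPath
    End Sub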

Nov 13 '05 #15
And I thought that life was easier before Access . . . Thanks Steve for
your pearls of wisdom. I may have a go at it, but I have the feeling
that it will take me a while. Instead, how about a pointed message on
the forms: do not sort on the marked columns unless you like viewing
the shape of an hourglass :-) Huh?

So apparently you have been there and done that? Are there no other
options that one could implement save bloating the front end? I'm still
wondering and waiting for an answer as to why the Totals query is so
fast when you sum the field, and when used in sorting and filtering.
And why does Access prohibit the updating of fields/controls that are
not a part of a Totals query when one is linked?

Here is an idea. If you have the time, perhaps you can email me a copy
of a sample db that has most of the elements and code that you have
described. That would be a nice gesture. And no rush either - just
whenever :-) My email address as shown is a valid one. Thanks.


Nov 13 '05 #16
