Bytes IT Community

slow queries, didn't use to be a problem

tdw
100+
P: 206
Hi all,

I have an "Address" field that, upon an After Update event, searches the database to see if the address I just entered already exists in the database. This search used to only take a little while on the first new record entry. From then on, as long as I kept the database open, this search was practically instantaneous.

I also have a button that I click when I am done entering a new record that, among other things, assigns a file number to the record. It does not use an autonumber field. What it does is search the database for the highest existing file number and adds 1. This search also used to be nearly instantaneous.

Suddenly one day, about two weeks ago, both of these searches started taking forever. I have to wait nearly 30-40 seconds for the address search, and over a minute or two for the file number search.

What might have caused this? I checked to make sure that the fields are indexed. They are.
Oct 7 '08 #1
20 Replies


missinglinq
Expert 2.5K+
P: 3,532
Is this a unified database or a split (front end/back end) database? Is it a stand-alone or does it run on a network? I ask because it really sounds like a network problem, assuming nothing else had changed in your setup, Access-wise.

We probably also need to see the code you're using for the check.

Linq ;0)>
Oct 7 '08 #2

tdw
100+
P: 206
Is this a unified database or a split (front end/back end) database? Is it a stand-alone or does it run on a network? I ask because it really sounds like a network problem, assuming nothing else had changed in your setup, Access-wise.

We probably also need to see the code you're using for the check.

Linq ;0)>
That may be. It is a front end/back end, with the back end on a server. Other tasks are slightly slower, such as printing (networked printer), etc. But they don't seem nearly as slow (in proportion to their usual speed) as the searches listed above. It's acting as though it's not indexing.

If you think seeing the code might help, I can try to get it for you tomorrow.
Thanks for the help. I will look into the server question.
Oct 7 '08 #3

tdw
100+
P: 206
tdw
Here are the two bits of code that are running super slow. The server doesn't seem slow when running any other programs.

Note: SC_NEW is a table for entering new orders, and for entering and storing quoted orders that have not yet been approved. When a new order gets the go-ahead, it is moved to the open orders table (SC_OPEN). Once an order has been completed, it is moved to the archived orders table (SC_ARCH). So there are three tables that these two bits of code search.

This is the code that checks for duplicate addresses. It is an AfterUpdate event on the ADDRESS field, which is bound to the SC_NEW table:
Private Sub ADDRESS_AfterUpdate()

'   Check to see if there is already a quoted order with this address
    Me.lblChecking1.Visible = True
    Me.ProgressBar.Visible = True
    Me.Form.Repaint
    If DCount("*", "SC_NEW", "[ADDRESS]='" & Me!ADDRESS & "'") <> 0 Then
        MsgBox "There is already a Pending or Quoted Order with this Address. Please search this NEW ORDERS form.", vbOKOnly
    End If

    Me.ProgressBar.Value = 33

'   Check to see if there is already an open order with this address
    Me.Form.Repaint
    If DCount("*", "SC_OPEN", "[ADDRESS]='" & Me!ADDRESS & "'") <> 0 Then
        MsgBox "There is already an OPEN ORDER with this Address. Please search the OPEN ORDERS form.", vbOKOnly
    End If

    Me.ProgressBar.Value = 66

'   Check to see if there is an archived order with this address
    Me.Form.Repaint
    If DCount("*", "SC_ARCH", "[ADDRESS]='" & Me!ADDRESS & "'") <> 0 Then
        MsgBox "This may be a Recert or a Duplicate order. An ARCHIVED ORDER with this address was found.", vbOKOnly
    End If

    Me.ProgressBar.Value = 100

    Me.lblChecking1.Visible = False
    Me.ProgressBar.Visible = False
    Me.Form.Repaint

End Sub
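The three checks can also be factored into one helper that stops at the first match; a sketch (the helper name is illustrative, the table and field names are from this thread):

```vba
' Return the name of the first table that already contains the
' address, or "" if none does. DLookup is used instead of DCount
' because we only need to know whether a match exists, not how many.
Public Function FindAddressTable(ByVal sAddr As String) As String
    Dim vTbl As Variant
    For Each vTbl In Array("SC_NEW", "SC_OPEN", "SC_ARCH")
        If Not IsNull(DLookup("[ADDRESS]", CStr(vTbl), _
                "[ADDRESS]='" & Replace(sAddr, "'", "''") & "'")) Then
            FindAddressTable = CStr(vTbl)
            Exit Function
        End If
    Next vTbl
End Function
```

The Replace call doubles any embedded apostrophe so an address like O'Brien St doesn't break the criteria string.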
This is the code that creates a file number for a new order. It runs as part of a sequence of things on a button click on the New Orders form (which uses the SC_NEW table as its record source):
    If DCount("*", "SC_OPEN", "Left([FILE_NO],4)=Format(Date(),'yyyy')") = 0 _
        And DCount("*", "SC_ARCH", "Left([FILE_NO],4)=Format(Date(),'yyyy')") = 0 Then

        Me.FILE_NO = Year(Date) & "-0001"   '   Reset file number to 1
        DoCmd.RunCommand acCmdSaveRecord

    Else

    '   Find the highest file number for the current year from both tables
        stHighOpen = Nz(DMax("Right([FILE_NO], 4)", "SC_OPEN", "Left([FILE_NO], 4) = Format(Date(), 'yyyy')"), 0)
        Me.ProgressBar.Value = 30

        stHighArch = Nz(DMax("Right([FILE_NO], 4)", "SC_ARCH", "Left([FILE_NO], 4) = Format(Date(), 'yyyy')"), 0)
        Me.ProgressBar.Value = 40

    '   Pick the one that is highest (Val converts the text digits to a number)
        If Val(stHighOpen) < Val(stHighArch) Then
            stHighest = Val(stHighArch)
        Else
            stHighest = Val(stHighOpen)
        End If

        Me.ProgressBar.Value = 50

    '   Create new file number
        If stHighest > 0 And stHighest < 9 Then
            Me.FILE_NO = Year(Date) & "-000" & stHighest + 1
        End If

        If stHighest > 8 And stHighest < 99 Then
            Me.FILE_NO = Year(Date) & "-00" & stHighest + 1
        End If

        If stHighest > 98 And stHighest < 999 Then
            Me.FILE_NO = Year(Date) & "-0" & stHighest + 1
        End If

        If stHighest > 998 Then
            Me.FILE_NO = Year(Date) & "-" & stHighest + 1
        End If

    End If
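The four padding branches can be collapsed with VBA's Format function; a minimal sketch (the function name is illustrative):

```vba
' Build a file number such as "2008-0012" from a year and a
' sequence number; Format$ pads the sequence to four digits,
' replacing the four range-checking If blocks.
Public Function BuildFileNo(ByVal lngYear As Long, ByVal lngSeq As Long) As String
    BuildFileNo = lngYear & "-" & Format$(lngSeq, "0000")
End Function
```

For example, BuildFileNo(Year(Date), Val(stHighest) + 1) produces "2008-0006" when the year is 2008 and the highest existing number is 5.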
Oct 9 '08 #4

tdw
100+
P: 206
Latest update: for some unknown reason, it sped back up for one day then went back to being slow. The server has been rebooted since, but that didn't speed it up.
Nov 12 '08 #5

tdw
100+
P: 206
I have narrowed down this problem, I think.

It seems to run fast in the morning, UNTIL another user also opens up the database on their computer. It is a front end/back end, with the back end on a server.

I found several things by googling about similar issues, but can't quite seem to find an answer that fits the situation.
Nov 26 '08 #6

ChipR
Expert 100+
P: 1,287
It seems that the main operations in the first bit of code are the DCount calls. Is there a primary key field that you can count, or another one that wouldn't contain nulls, instead of *? Also, I've been using DLookup to do the same thing; does anyone know whether one has any advantage over the other?
Nov 26 '08 #7

tdw
100+
P: 206
I just checked and I actually had not assigned a primary key.
So now I have made the FILE_NO field a primary key in the tables. The second bit of code contains DMax functions that use this field.
The DCount function is being used on an ADDRESS field, which is not a primary key. In fact, it can contain duplicates. We often go back and do a new survey on a property that we've been to before, but it is a new order each time.
Nov 26 '08 #8

ChipR
Expert 100+
P: 1,287
I noticed that the criteria for the DCount is the Address, but the expression being counted is * (all fields). I have to assume that DCount filters the records on the Address first and then counts the matching rows, so this shouldn't matter too much, but I'm thinking about trying it out and timing it.
Nov 26 '08 #9

tdw
100+
P: 206
Ah, I see what you mean. Because of my lack of knowledge, I do a lot of my code writing from help files and examples on help sites, or by copying code from one part of the database to another and making the changes I think I need... that would be why I have "*" there as the expression. Would doing that differently speed it up?

For the ADDRESS, I don't actually need a "count" per se, I just need it to pop up the message if it finds a match. I don't care how MANY matches, one is enough.
Nov 26 '08 #10

ChipR
Expert 100+
P: 1,287
I know that DLookup only returns one value even if there are many matches, but I don't know if it actually stops looking when it finds one. If it does, it might be quicker. I do this: If Not IsNull(DLookup(...)) Then a record already exists.
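Written out against this thread's SC_NEW table, that pattern looks roughly like this (a sketch; the function name is illustrative):

```vba
' DLookup returns Null when no row matches, so a Not IsNull test
' is an existence check that fetches one value instead of a count.
Public Function AddressExists(ByVal sAddr As String) As Boolean
    AddressExists = Not IsNull(DLookup("[ADDRESS]", "SC_NEW", _
        "[ADDRESS]='" & sAddr & "'"))
End Function
```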
Have you tried timing any chunks of your code? I had a really slow form with subforms on multiple tabs and timed functions to determine what was really slow and focused on improving those.

I'm just learning VBA myself, so I'm going from the MSDN and office websites mostly. This has DLookup and DCount and all the functions:
http://office.microsoft.com/en-us/ac...288171033.aspx
And this has been the most important reference (though not helpful here):
Access Object Model Reference
Nov 26 '08 #11

tdw
100+
P: 206
Thanks for the links. I'll check them out and see if they give me a new way to do it.
Part of what's throwing me for a loop is that this did not use to be a problem. Everything ran very fast, even when others were using the database at the same time. Splitting it into front end/back end did not slow it down; in fact, it sped it up.
Then suddenly it goes very, very slow when someone else has the database open, even though they have their own front end. It doesn't seem like the number of records in the tables could be the reason either, because there are only a few hundred more records now than there were before. Considering that there are a few thousand records, it doesn't seem like the addition of a few hundred could have had that much effect.
Nov 26 '08 #12

ChipR
Expert 100+
P: 1,287
Very strange. When your users open the database, do you open and hide all your forms? I read that suggestion somewhere for improving the apparent speed of the application, but it seems like that would establish connections to every table from every user, so I don't do it. With regard to multiple users, I only know a little about record locking, but I think if you were waiting because of that, you would get a message.
Is it slow whenever anyone else is connected, or just from certain computers? Is it slow with just two connections, and even slower with more? A network problem may be really hard to track down, but I've read about crazy network situations I would never have thought to investigate.
I still think I would stick some timing in, like:

    Dim startTime As Single, endTime As Single   ' note: Dim a, b As Single would leave a as Variant
    startTime = Timer
    ' ...the DCount call...
    endTime = Timer
    MsgBox "DCount took " & Format$(endTime - startTime, "0.0000") & " seconds."

Then you can see if it changes when another user connects and track down where it's getting slow.
Nov 26 '08 #13

tdw
100+
P: 206
Well, without adding the timer stuff I can tell you a few things for certain.
I have a progress bar, so I can see generally where it goes slow. I'm not sure adding the timer would help much because I already know the following:

1. It only takes one other person to be on in order to slow it down.
2. It slows down at the DCount, DMax, DLookup lines.
3. What used to take (and still does if I'm the only one on) 5 to 7 seconds, now takes approximately 2 full minutes!
Nov 26 '08 #14

FishVal
Expert 2.5K+
P: 2,653
Hello, tdw.

I would suggest you localize the source of the problem further.
  • Create blank databases on the client machines with a simple VBA sub benchmarking just a single domain aggregate function call, in the way ChipR has suggested.
        Debug.Print Now()
        dummy = DLookup(....)
        Debug.Print Now()
    Connect to the existing back end on the server and test execution speed.
  • Create a blank database on the server. Import the table from the existing back end and run the same benchmark on the existing front ends and on the new blank front end.

Regards,
Fish
Nov 27 '08 #15

tdw
100+
P: 206
Thanks for the advice. I will do that as soon as I have time. If and when I get results, bad or good, I'll post again here.
Dec 1 '08 #16

FishVal
Expert 2.5K+
P: 2,653
Check this also.
IMHO, it looks promising.
Dec 1 '08 #17

tdw
100+
P: 206
Looks promising to me too. I'll let you know how it works out when I get a chance to try it. Thanks!
Dec 2 '08 #18

tdw
100+
P: 206
Quick question on this.
I have been reading through the link you gave me, FishVal. I have a switchboard that is always open on the front end, which of course uses a table as its record source. Isn't that already a persistent recordset connection? If so, it looks like I'll need to do the suggestions further down in the thread.
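A bound switchboard does hold a connection, but the trick usually described for split Access databases is keeping a dedicated DAO recordset open against a back-end table for the life of the front end, so Jet doesn't re-open the lock file for every operation. A sketch, assuming any linked back-end table will do (SC_OPEN is used here only as an example):

```vba
' Hold one recordset open for the whole session. Call
' OpenPersistentConnection from the startup form's Open event and
' ClosePersistentConnection when the application shuts down.
Option Explicit

Private rsPersist As DAO.Recordset

Public Sub OpenPersistentConnection()
    Set rsPersist = CurrentDb.OpenRecordset("SC_OPEN", dbOpenSnapshot)
End Sub

Public Sub ClosePersistentConnection()
    If Not rsPersist Is Nothing Then
        rsPersist.Close
        Set rsPersist = Nothing
    End If
End Sub
```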
Dec 3 '08 #19

FishVal
Expert 2.5K+
P: 2,653
Yes. You already have this option applied.
Have you tried the benchmarks I suggested?
Dec 3 '08 #20

tdw
100+
P: 206
Not yet. I will as soon as time allows.
Dec 3 '08 #21
