Bytes IT Community

Advice on Looking up

Hi all,

Sorry for the long post, but I don't know how else to explain this one.
Just after some advice really as I'm finding it difficult to update a
database. Currently this is running in Access 97 on NT but I do have
the option to upgrade to Access 2002 on XP.

In essence I'm dealing with over 140,000 items of data; these are
spread across about 400 identifiers, each with about 44 fields for
each of 8 years.

All the fields across the 8 years for an identifier are updated at
once, so at the moment I detect when an identifier has been updated
and then delete all entries I have for that identifier. After this I
cycle through all identifiers and the required fields/years, and if my
database doesn't have a value it goes and retrieves it. It's this last
step which is taking a long time.
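For context, an alternative I've been wondering about is a single unmatched query that finds all the missing combinations in one pass, rather than checking each value individually. Roughly like this (the table and field names here are made up):

```vba
' Rough sketch with made-up names: find every identifier/field/year
' combination in tblRequired that has no stored value in tblData yet,
' in one query instead of a row-by-row loop.
Dim db As DAO.Database
Dim rs As DAO.Recordset
Set db = CurrentDb
Set rs = db.OpenRecordset( _
    "SELECT r.Identifier, r.FieldName, r.YearNo " & _
    "FROM tblRequired AS r LEFT JOIN tblData AS d " & _
    "ON (r.Identifier = d.Identifier " & _
    "AND r.FieldName = d.FieldName " & _
    "AND r.YearNo = d.YearNo) " & _
    "WHERE d.Identifier Is Null", dbOpenSnapshot)
Do While Not rs.EOF
    ' Retrieve and insert the missing value here
    rs.MoveNext
Loop
rs.Close
```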

Initially I was using DCount to check for the value, but I have
changed this to something very similar to Trevor Best's well-known
tCount, and it still seems slow to me. For example, it takes 1.5 hours
to go through all 140,000+ entries when there are no changes.
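For reference, my check is along these lines (a tCount-style helper that opens a recordset on a Count(*) query instead of calling DCount; the table name is a stand-in):

```vba
' tCount-style helper: count matching rows via a snapshot recordset
' rather than DCount. The table name tblData is a stand-in.
Public Function tCountRows(strWhere As String) As Long
    Dim rs As DAO.Recordset
    Set rs = CurrentDb.OpenRecordset( _
        "SELECT Count(*) AS n FROM tblData WHERE " & strWhere, _
        dbOpenSnapshot)
    tCountRows = rs!n
    rs.Close
End Function
```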

I've considered changing the way it works so that I record the updated
identifiers and then only retrieve data for those, but the advantage
of the current approach is that I can simply add a new field or year
and all the information is dragged in automatically.

So my question is: does anyone have any ideas as to how I might speed
this up? Would migrating to 2002 and XP yield a speed improvement, or
can anyone think of another methodology for the program?

Any advice would be received with thanks,

Brad

Nov 13 '05 #1
3 Replies


Upgrading probably won't help.

There's not a lot of information in this post to go on; however, _do_
make sure that the field you are identifying records by is indexed.

You also might want to switch to doing update queries on groups of
identified data.


Nov 13 '05 #2

Thanks Chuck

The field is indeed indexed; however, I've now also created an index
across all three fields, which may have yielded some gains. I'm just
doing some more tests to see if the improvement is consistent.
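For the record, the composite index was created with a one-off statement along these lines (names changed):

```vba
' One-off DDL to create a composite index covering all three lookup
' fields. Index and table names are stand-ins.
CurrentDb.Execute _
    "CREATE INDEX idxIdentFieldYear " & _
    "ON tblData (Identifier, FieldName, YearNo)"
```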

I can provide more information if anyone thinks it would help to find a
solution. Or do people think that over an hour is realistic for Access
to cycle through 140,000+ entries?
Don't think I've ever written anything this big before, so it's more my
inexperience which is making me question it.

Thanks again,

Brad

Nov 13 '05 #3

140,000 entries is a lot, but it's not that much. I just did an
update of 388,000+ records the other day and it took about 5 seconds.
Again, indexing is the key here, along with (occasionally) re-thinking
how you are doing whatever the heck it is you're doing.

Since you haven't given a lot of detail about what you're doing, I can
only talk in rather general terms here, but think carefully about your
approach. Would working with "groups" of data reduce the number of
records you touch? For example, if you are walking all 140,000
entries but only working with 3 fields, would grouping them together
result in fewer records to work with? An update query will handle a
(near) unlimited number of records to make whatever changes you are
doing.


Nov 13 '05 #4

This discussion thread is closed

Replies have been disabled for this discussion.