Bytes | Software Development & Data Engineering Community

Advice on Looking up

Hi all,

Sorry for the long post, but I don't know how else to explain this one.
Just after some advice really as I'm finding it difficult to update a
database. Currently this is running in Access 97 on NT but I do have
the option to upgrade to Access 2002 on XP.

In essence I'm dealing with over 140,000 items of data; these are
spread across about 400 identifiers, each of which has about 44 fields
specific to each of 8 years.

All the fields across the 8 years for a given identifier are updated
at once, so at the moment I detect when an identifier has been
updated and then delete all entries I have for that identifier. After
this I cycle through all identifiers and the required fields/years and,
if my database doesn't have a value, it goes and retrieves it. It's
this last step which is taking a long time.

Initially I was using DCount to check for the value but have changed
this to something very similar to Trevor Best's well-known tCount, but it
still seems slow to me. For example, it takes 1.5 hours to go through all
140,000+ entries when there are no changes.
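For anyone unfamiliar, the check I'm doing per value is roughly along these lines, a tCount-style existence test rather than DCount. The table and field names here (tblData, Identifier, Yr, FieldName) are just placeholders, not my actual schema:

```vba
' Sketch of a recordset-based existence check in the spirit of tCount.
' SELECT TOP 1 lets Jet stop at the first match instead of counting
' every matching row the way DCount does.
Public Function ValueExists(strIdent As String, intYear As Integer, _
                            strField As String) As Boolean
    Dim db As DAO.Database
    Dim rs As DAO.Recordset
    Set db = CurrentDb()
    Set rs = db.OpenRecordset( _
        "SELECT TOP 1 Identifier FROM tblData " & _
        "WHERE Identifier = '" & strIdent & "' " & _
        "AND Yr = " & intYear & " " & _
        "AND FieldName = '" & strField & "'", dbOpenSnapshot)
    ValueExists = Not rs.EOF
    rs.Close
End Function
```

This is called once per identifier/year/field combination, which is why the total number of calls is so large.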

I've considered changing the way it works so that I record the updated
identifiers and then only retrieve for them but the advantage with the
current way is that I can simply add a new field or year and all the
information would be dragged in.

So my question is: does anyone have any ideas as to how I might be able
to speed this up? Would migrating to Access 2002 on XP yield some speed
gains, or can anyone think of another methodology for the program?

Any advice would be received with thanks,

Brad

Nov 13 '05 #1

Upgrading probably won't help.

There's not a lot of information in this post to go on; however, _do_
make sure that the field you are identifying on is indexed.

You also might want to switch to doing update queries on groups of
identified data.
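By an update query on a group I mean something like the following: one set-based statement per identifier instead of a lookup per value. This is only a sketch, since I don't know your schema; tblData, tblSource, and the field names are invented for illustration:

```vba
' One engine-side update for everything belonging to one identifier,
' pulling new values from a (hypothetical) source table in a single pass.
Dim db As DAO.Database
Set db = CurrentDb()
db.Execute "UPDATE tblData INNER JOIN tblSource " & _
           "ON (tblData.Identifier = tblSource.Identifier " & _
           "AND tblData.Yr = tblSource.Yr) " & _
           "SET tblData.Amount = tblSource.Amount " & _
           "WHERE tblData.Identifier = 'ABC123'", dbFailOnError
Debug.Print db.RecordsAffected & " rows updated"
```

Letting the Jet engine do the work in one statement is almost always faster than looping through a recordset in VBA.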


Nov 13 '05 #2
Thanks Chuck

The field is indeed indexed, however I've now also created an index
across all three fields which may have yielded some gains. I'm just
doing some more tests to see if it's consistent or not.
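For the record, the composite index can be created with a one-off DDL statement rather than through the table designer; the names here are placeholders for my actual table and columns:

```vba
' Add a composite index covering all three fields used in the lookup,
' so Jet can resolve the WHERE clause from the index alone.
CurrentDb.Execute "CREATE INDEX idxLookup " & _
                  "ON tblData (Identifier, Yr, FieldName)"
```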

I can provide more information if anyone thinks it would help to find a
solution. Or do people think that over an hour is realistic for Access
to cycle through 140,000+ entries?
I don't think I've ever written anything this big before, so it's more
my inexperience that is making me question it.

Thanks again,

Brad

Nov 13 '05 #3
140,000 entries is a lot, but it's not that much. I just did an
update of 388,000+ records the other day and it took about 5 seconds.
Again, indexing here is the key, along with (occasionally) re-thinking
how you are doing whatever the heck it is you're doing.

Since you haven't given a lot of details of what you're doing, I can
only talk in rather general terms here, but think carefully about what
you are doing. Would working with "groups" of data reduce the number of
records involved? For example, if you are walking all 140,000
entries and only working with 3 fields, would grouping them together
result in fewer records to work with? An update query will handle a
(near) unlimited number of records to make whatever changes you are
doing.
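One concrete way to avoid walking every entry is to let the engine find the gaps for you: an unmatched (outer join) query returns only the identifier/year/field combinations with no stored value, so your retrieval loop only touches what's actually missing. Again, a sketch only; all table and field names here are illustrative:

```vba
' tblRequired (hypothetical) lists every combination that should exist;
' the LEFT JOIN / Is Null test returns only the ones tblData lacks.
Dim db As DAO.Database, rs As DAO.Recordset
Set db = CurrentDb()
Set rs = db.OpenRecordset( _
    "SELECT r.Identifier, r.Yr, r.FieldName " & _
    "FROM tblRequired AS r LEFT JOIN tblData AS d " & _
    "ON (r.Identifier = d.Identifier AND r.Yr = d.Yr " & _
    "AND r.FieldName = d.FieldName) " & _
    "WHERE d.Identifier Is Null", dbOpenSnapshot)
Do While Not rs.EOF
    ' retrieve only the missing values here
    rs.MoveNext
Loop
rs.Close
```

On a run with no changes, the recordset comes back empty and the loop does nothing, instead of checking 140,000+ values one at a time.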


Nov 13 '05 #4


