Bytes | Software Development & Data Engineering Community

Speed issue in Access

Hi all,

I have an Access db with two large tables - 3,100,000 (tblA) and 7,000
(tblB) records. I created a select query using Inner Join by partial
matching two fields (X from tblA and Y from tblB). The size of the db
is about 200MBs.

Now my issue is, the query has been running for over 3 hours already -
I have no idea when it will end. I am using Access 2003. Are there
ways to improve the speed performance? (Also, would the query run
faster if I index the two fields?)

Any helpful suggestion will be appreciated.
- Grasshopper -

Jan 26 '07
On Fri, 26 Jan 2007 14:31:04 -0330, Tim Marshall
<TI****@PurplePandaChasers.Moertherium> wrote:

Same here. Last week it didn't get above 65. Brrrr.
-Tom.

>Tom van Stiphout wrote:
>On Fri, 26 Jan 2007 02:08:43 -0330, Tim Marshall
><TI****@PurplePandaChasers.Moertherium> wrote:

Me too. Straight into the spines of a Cholla cactus :-)
-Tom.

<Sigh> Wish it was cactus here... we've got a "weather bomb" bearing
down on us for Saturday - which would be great if it were a weekday
and meant a day off work, but on Saturday? What a waste... 8)
Jan 27 '07 #11
Are you guys going to help or just chat about the weather?

Jan 27 '07 #12
"SQL Learner" <ex***********@gmail.com> wrote in
news:11*********************@l53g2000cwa.googlegroups.com:
Are you guys going to help or just chat about the weather?
I would just chat about the weather, at least until I had some more
information.

1. I want to see the REAL problem. A regular here can spend thirty
minutes or so coming up with a suggested solution for a problem.
Often, the original poster responds that the solution didn't work.
Why? Because the problem presented was just an example, or he/she
didn't mention that the problem occurred in Access 2.0, or that one
of the tables is a text file. Your situation of designating a State
through identification of street name and number seems a candidate to
be the origin of one of these frustrating experiences. Surely every
State has several 120 Main Streets, so how could 120 Main Street
identify the State? Of course, maybe I'm misunderstanding what you
wrote. That is not my problem; it's your problem. Perhaps you have a
very small amount of data, so that, indeed, there is only one 120
Main Street and it is in NY. But a record count of 2,100,000 does not
seem to suggest this.
Regardless, the problem you present is quite challenging, and is the
result of a very poorly thought out design, or some kind of legacy
situation.

2. I want to know about the REAL problem. Is this a one-time need, or
will the data be used time and time again to answer question after
question? If it's the latter, we need to parse the address data,
probably into fields recognized and used by the US Post Office. Then
the State designation may become easier and faster. And so will the
answers to other questions down the line.

My guess is that Tim and/or Tom would charge you $100 an hour if you
were paying them. As you aren't, they may chat about whatever they
please.
If you describe your problem again, very precisely and openly, then
one of them, or someone else, may take a run at it.

Jan 27 '07 #13
"SQL Learner" <ex***********@gmail.com> wrote
Thank you, Tom. Here is the SQL:

SELECT *
FROM tblA AS M INNER JOIN tblB AS S ON M.X Like "*" & S.Y & "*";
My guess is that using the LIKE comparison requires a manipulation and
comparison of every record, vastly increasing the time and disk accesses
required.

If you can redesign your tables so that you are comparing indexed fields,
you are likely to see a significant increase in speed, as the initial
comparison will only use the indexes.

If you can come up with another way to solve the business purpose, you may
be able to use a different approach that is faster, as Lyle pointed out.
Because, if you stick with the approach you came up with to deal with the
problem at hand, there may not BE a way to improve the speed.
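Larry's point about LIKE defeating the indexes can be sketched
quickly. This sketch uses SQLite in place of Jet (an assumption; the
engines differ, but neither can seek an index for a pattern with a
leading wildcard, only for exact or prefix matches):

```python
import sqlite3

# Hypothetical tables mirroring tblA/tblB from the original post.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE tblA (X TEXT);
    CREATE TABLE tblB (Y TEXT);
    CREATE INDEX idx_a_x ON tblA (X);
    CREATE INDEX idx_b_y ON tblB (Y);
""")

# Exact join on indexed columns: the planner can SEARCH via an index.
exact = con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT * FROM tblA AS M INNER JOIN tblB AS S ON M.X = S.Y"
).fetchall()

# LIKE with a leading wildcard built from the other table's column:
# no index seek is possible, so both sides become full scans.
fuzzy = con.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT * FROM tblA AS M INNER JOIN tblB AS S "
    "ON M.X LIKE '%' || S.Y || '%'"
).fetchall()

print(exact)  # one side is a SEARCH using an index
print(fuzzy)  # both sides are plain scans
```

With 3,100,000 x 7,000 rows, the scan version is roughly 21 billion
string comparisons, which is why the query runs for hours.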

And, also, as he pointed out, oversimplifying can be even worse than
burying us in irrelevant information. The FAQ should state, but
doesn't, "realize that each additional request for clarification that
the potential responders have to make reduces your probability of
getting a good answer to your question". If it is like "pulling one's
own teeth" to determine what you are actually trying to accomplish,
some will just lose interest and go answer other questions.

Larry Linson
Microsoft Access MVP
Jan 27 '07 #14
====================================================
....... LIKE comparison requires a manipulation and comparison of
every record, vastly increasing the time and disk accesses required.
====================================================
....... we need to parse the address data, probably into fields
recognized and used by the US Post Office .......
====================================================

Hi Larry and Lyle,

Thank you. Those are the kinds of answers (above) I need.

The actual problem is very complex and might take 2,000 words to
describe. People may lose interest reading such a complex spec (or
really ask me to pay $100 per hour - which I definitely could not
afford). That is why I would rather present the question in a
simplified way. Once I get some hints, I might be able to solve the
problem myself.

I already got a solution with the help of another MVP:

UPDATE tblStreet_Value AS S INNER JOIN tblWord_Number AS W
ON S.Street Like "*" & W.Number & "*" & W.Word & "*"
SET S.Value2 = W.Value;

The query works very fast on the sample tables, but when it is
applied to the real data (with 3,200,000 records in tblStreet_Value
and 7,000 in tblWord_Number), it takes forever for the result to come
up.

My objective is, as stated in my initial post, to find a way to speed
up the query. While waiting for answers, I experimented with
different types of queries and tried the same methods that you
suggested: parse the street field into several different fields, use
a "perfect match" instead of using Like for a partial match, and
index the fields being linked.

For experimental purposes, I used a SELECT query instead of an UPDATE
query. These changes improved the speed, but it is still not fast
enough (3 hours on my dual-core laptop).

Holding all other things constant (same machine spec and same query),
would the query be faster if it were run on SQL Server? Does Access
2003 still use the same Jet engine as SQL Server?
Would indexing really slow down the update query (as I read
somewhere)?
In general, would the use of Group By or Distinct slow down the
query? Which is faster - Group By or Distinct?

(I understand that the slowdown may be caused by the poor design of
the database and an inadequate SQL statement.)
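The Group By vs. Distinct question can at least be sanity-checked on
a small sample. A minimal sketch, using SQLite's in-memory engine as
a stand-in for Jet (an assumption; Jet's optimizer may treat the two
differently): for a plain column list with no aggregates, the two
return the same deduplicated rows, and most engines plan them the
same way, so correctness, not speed, usually decides between them.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (city TEXT)")
con.executemany("INSERT INTO t VALUES (?)",
                [("NY",), ("NY",), ("LA",), ("LA",), ("SF",)])

# Both forms deduplicate; ORDER BY makes the comparison deterministic.
distinct = con.execute(
    "SELECT DISTINCT city FROM t ORDER BY city").fetchall()
grouped = con.execute(
    "SELECT city FROM t GROUP BY city ORDER BY city").fetchall()

print(distinct == grouped)  # True
```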

Sorry guys, next time I will be more thoughtful.

- Your Humble Grasshopper -

Jan 27 '07 #15

"SQL Learner" <ex***********@gmail.com> wrote in message
news:11**********************@s48g2000cws.googlegroups.com...
> [post #15 quoted in full - snipped]
To speed this up you need exact matches on fields that are indexed.
Parsing the data up in a SELECT query does not allow you to index the
individual fields, so it accomplishes nothing.

You need to create a NEW table where the existing data is split into fields you
can index and then apply exact matching to. Anything less than that and you are
stuck with how long it takes to scan the entire table a row at a time which is
99% driven by the speed of your disk drive access on the PC.

SQL server would not help unless it was on a machine that had faster disk i/o
than the one you have now and you absolutely do NOT want this to run over a
network. That would slow i/o to a point that would be brutal.
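Rick's split-and-index approach might be sketched as below, with
SQLite standing in for Jet. The new table and column names
(tblParsed, Direction, Num, Name) and the parsing rule are
illustrative assumptions, not from the thread:

```python
import re
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE tblStreet_Value (Street TEXT, Value2 INTEGER);
    CREATE TABLE tblParsed (Direction TEXT, Num TEXT, Name TEXT,
                            Value2 INTEGER);
""")
con.execute("INSERT INTO tblStreet_Value (Street) VALUES (?)",
            ("East 123 ABC Street",))

# One-time pass: parse each raw street string into parts and load the
# NEW table. Assumed layout: direction token, number, street name.
pattern = re.compile(r"^(\S+)\s+(\d+)\s+(.+)$")
rows = con.execute("SELECT Street, Value2 FROM tblStreet_Value").fetchall()
for street, value2 in rows:
    m = pattern.match(street)
    if m:
        con.execute("INSERT INTO tblParsed VALUES (?, ?, ?, ?)",
                    (m.group(1), m.group(2), m.group(3), value2))

# Index the split fields so later joins are exact and index-driven.
con.execute("CREATE INDEX idx_parsed_num ON tblParsed (Num)")
con.execute("CREATE INDEX idx_parsed_name ON tblParsed (Name)")

row = con.execute(
    "SELECT Direction, Num, Name FROM tblParsed").fetchone()
print(row)  # ('East', '123', 'ABC Street')
```

The parse is paid once; every subsequent join against tblParsed can
then use exact, indexed matches instead of a LIKE scan.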
--
Rick Brandt, Microsoft Access MVP
Email (as appropriate) to...
RBrandt at Hunter dot com
Jan 27 '07 #16
Thanks, Rick.

I created a new table for the parsed data. For now, I will continue
to use Access.

- Grasshopper -

Jan 27 '07 #17
On Jan 27, 10:22 am, "SQL Learner" <excelmodel...@gmail.com> wrote:
> [post #15 quoted in full - snipped]
If you've decided to parse the address field, why not tell us? Then
give us a few examples of what the text in an address field really
looks like. Then ask, "What's your best solution on how to parse the
address field and populate the (more) atomic fields?" I'm guessing I
would use a UDF with Regular Expressions, but others might have
better, faster solutions.

IMO, the notion that indexing will drastically slow down an update
query is greatly exaggerated. Of course, in your case, as you will
not have indexed the new fields (you may want to later), it will not
slow things at all. (The reason an index might slow down an update is
that a new index key and pointer must be written for each new or
updated value; if there's no index on the field, there's no new key
and, hence, no slowdown.)
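The mechanism - index keys are rewritten only when the indexed column
itself changes - can be observed with a rough timing sketch. SQLite
again stands in for Jet; absolute numbers are machine-dependent and
the gap may be small at this scale, so no particular ratio should be
read into the printed figures:

```python
import sqlite3
import time

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE t (indexed_col INTEGER, plain_col INTEGER)")
con.executemany("INSERT INTO t VALUES (?, ?)",
                [(i, i) for i in range(200_000)])
con.execute("CREATE INDEX idx_t ON t (indexed_col)")
con.commit()

# Updating the non-indexed column leaves idx_t untouched.
start = time.perf_counter()
con.execute("UPDATE t SET plain_col = plain_col + 1")
plain = time.perf_counter() - start

# Updating the indexed column forces every index key to be rewritten.
start = time.perf_counter()
con.execute("UPDATE t SET indexed_col = indexed_col + 1")
indexed = time.perf_counter() - start

print(f"plain: {plain:.3f}s  indexed: {indexed:.3f}s")
```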

SQL Server is likely to do an update based on an existing field at
the server level. As I recall, SQL Server 2005 lets us use Regular
Expressions with a T-SQL UDF. If that's the case, we should be able
to parse out the data in ten minutes. But I've never done this.
In any case, SQL Server might be faster. As it would do the work
without tying up your local machine at all, you are unlikely to
notice how long it takes, unless you are waiting breathlessly for
results.

Jan 27 '07 #18
Thanks, Lyle.

I think the best solution is to parse the string into different fields
and index them. (It is good that none of these fields will be the
updated fields.)

It is quite a challenge to parse the following formats

"East 123 ABC Street" or "S. 100 Apple Road"

into

Field1 Field2 Field3
-------- --------- ----------------
East 123 ABC Street
S. 100 Apple Road

since there is no common "delimiter character" to use.

Maybe one can use the following logic - split the string at the first
space before the number (field1), extract the number between the
first and second spaces (field2), and extract the last piece of data
after the last number digit + space (field3). How can I write T-SQL
UDFs for these three fields in Access?
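That three-part rule might be prototyped outside Access first, e.g.
with a regular expression. This sketch assumes the address always
begins with a single direction token followed by the number (an
assumption drawn from the two examples; real data will need more
cases):

```python
import re

# Named groups for the three target fields: direction, number, name.
addr_re = re.compile(r"^(?P<dir>\S+)\s+(?P<num>\d+)\s+(?P<name>.+)$")

for addr in ("East 123 ABC Street", "S. 100 Apple Road"):
    m = addr_re.match(addr)
    print(m.group("dir"), "|", m.group("num"), "|", m.group("name"))
# East | 123 | ABC Street
# S. | 100 | Apple Road
```

The same pattern could then be ported to a VBA function (via
VBScript.RegExp) or, on SQL Server 2005, a CLR/T-SQL UDF, and used to
populate the three new fields.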

- Grasshopper -

Jan 27 '07 #19
On Jan 27, 10:29 am, "Rick Brandt" <rickbran...@hotmail.com> wrote:
> "SQL Learner" <excelmodel...@gmail.com> wrote in message
> news:11**********************@s48g2000cws.googlegroups.com...
> SQL server would not help unless it was on a machine that had faster
> disk i/o than the one you have now and you absolutely do NOT want
> this to run over a network. That would slow i/o to a point that
> would be brutal.
Please, expand on this. It seems you are relating SQL Server to
running over a network and suggesting that running the query in SQL-
Server "over a network", whatever that would mean, would be slow. I
doubt very much if that is what you mean.

Jan 27 '07 #20


