Table exceeding one million rows crashes Access

Using MS Access 2K, I have a client with a number of separate customer
tables, one per country (approx. 50 tables), stored on a SQL Server
backend.

I cleaned up the data in the tables and inserted all the records into
one table and ended up with a table with 1,200,000 rows that looks like
this:

CustomerID int (PK)
CustomerName nvarchar(100)
CountryID int
ProvinceID int
Province nvarchar(50)

I created a stored procedure to filter the table that looks like this:

@CountryID int,
@ProvinceID int,
@LeftCustomerName nvarchar(10)

AS
set nocount on

DECLARE @strLength smallint
SELECT @strLength = LEN(@LeftCustomerName)

SELECT
CustomerID,
CustomerName,
Province
FROM
dbo.tblWorldCustomers
WHERE
(CountryID = @CountryID)
AND
(ProvinceID = @ProvinceID OR @ProvinceID Is Null)
AND
(LEFT(CustomerName,@strLength) = @LeftCustomerName)
ORDER BY
CustomerName, Province

The query allows the user to filter the customers by country,
optionally by province, and by up to the first ten letters of the
customer name. It was fine and dandy when there were 100,000 records, but as I
kept inserting records this query started locking up my machine. On
the first run of the query it is extremely slow -- if it completes at
all without crashing the Access app. On second and subsequent runs,
it performs acceptably (though a little sluggish - maybe returning
records in three seconds or so - and certainly it will be much slower if
deployed over the client's VPN WAN).
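
A typical call from the front end looks something like this (just an example - the procedure name below is a stand-in, I haven't shown the real one):

-- procedure name is a placeholder; parameter values are examples
EXEC dbo.procFilterWorldCustomers @CountryID = 44, @ProvinceID = NULL, @LeftCustomerName = N'Sm'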

I have an index on CountryID and an index on ProvinceID, and I have played
with removing and adding indexes on those and on the
CustomerName column, but can't seem to make this perform any better.
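
For reference, the indexes were created along these lines (a rough sketch - the index names are made up):

CREATE INDEX IX_WorldCustomers_Country ON dbo.tblWorldCustomers (CountryID)
CREATE INDEX IX_WorldCustomers_Province ON dbo.tblWorldCustomers (ProvinceID)
CREATE INDEX IX_WorldCustomers_Name ON dbo.tblWorldCustomers (CustomerName)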

The bad news is that I have another 2 million rows to insert!!!

I have not had to deal with tables with this many rows behind an
Access front end before. Am I missing something here that would improve this?

Any help is greatly appreciated.
lq

Dec 16 '05 #1
I would try this - a LIKE with a trailing wildcard is sargable, so SQL Server can seek on a CustomerName index, whereas LEFT() applied to the column can't use one:

AS

SELECT @LeftCustomerName = @LeftCustomerName) + '%'

SELECT
CustomerID,
CustomerName,
Province
FROM
dbo.tblWorldCustomers
WHERE
(CountryID = @CountryID)
AND
(ProvinceID = @ProvinceID OR @ProvinceID Is Null)
AND
CustomerName LIKE @LeftCustomerName
ORDER BY
CustomerName, Province

--
Lyle Fairfield
Dec 16 '05 #2
Sorry, typo

SELECT @LeftCustomerName = @LeftCustomerName + '%'
--
Lyle Fairfield
Dec 16 '05 #3
Lyle,
Yes!
This is many times faster!!!!

Now I just need to figure out how to make the first time the query is
run faster.

Much thanks,
lq

Dec 16 '05 #4
Just wondering how many rows I can get away with and still have decent
performance using a query like this?
lq

Dec 16 '05 #5
Lauren Quantrell wrote:
Just wondering how many rows I can get away with and still have decent
performance using a query like this?
lq


I hope you will tell us.

--
Lyle Fairfield
Dec 16 '05 #6
I have no idea but I would certainly experiment with whatever came to
mind.

Here's one thing.

Create 26 indexed views which effectively split the table into 26
parts based on the first character of the name, that is, one for a, one for b,
one for c, ... one for z. Use VBA, T-SQL or whatever to direct your
SELECT against the appropriate view.

The views could be created something like this:

With CurrentProject.Connection

    .Execute "CREATE VIEW mSchools WITH SCHEMABINDING AS " & _
        "SELECT fldSchoolID, fldSchoolName, fldAdministratorPassword, " & _
        "fldViewerPassword, fldDomain FROM dbo.tblSCHOOLS " & _
        "WHERE fldSchoolName LIKE 'M%'"

    .Execute "SET ARITHABORT ON"

    .Execute "CREATE UNIQUE CLUSTERED INDEX mSchoolsID ON mSchools (fldSchoolID)"

    .Execute "CREATE INDEX mSchoolsNames ON mSchools (fldSchoolName)"

    .Execute "SET ARITHABORT OFF"

End With

and the Select (for "m") would be something like this:

"SELECT * FROM mSchools WHERE fldSchoolName LIKE 'Martin%'")

Would this make things faster? I don't know. It could make the whole
server run noticeably slower, with 52 more indexes to deal with.
Perhaps someone can tell us what SHOULD happen. Only you can tell us
what does happen.

In any case

With CurrentProject.Connection
    .Execute "DROP VIEW mSchools"
End With

should be a quick cure.

Dec 16 '05 #7
Tried it with indexed views and it's definitely slower than just
running the queries against the table itself.
lq

Dec 16 '05 #8
Lauren Quantrell wrote:
Tried it with indexed views and it's definitely slower than just
running the queries against the table itself.
lq


Ouch! So much for that idea ....

--
Lyle Fairfield
Dec 16 '05 #9
Hmm, I would take the basic SQL you've got and play about with filling a
temp table first. I would expect to see a performance improvement with
something like:-

AS
DECLARE @LENLeftCustName int

SELECT @LENLeftCustName = LEN(@LeftCustomerName)

CREATE TABLE #tCusts(
CustomerID int,
LCustName nvarchar(10),
CustomerName nvarchar(100),
CountryID int,
ProvinceID int,
Province nvarchar(50)
)

INSERT INTO #tCusts(
CustomerID,
CustomerName,
LCustName,
CountryID,
ProvinceID,
Province
)
SELECT
CustomerID,
CustomerName,
LEFT(CustomerName, @LENLeftCustName),
CountryID,
ProvinceID,
Province
FROM
dbo.tblWorldCustomers
WHERE
(CountryID = @CountryID)
AND
(ProvinceID = @ProvinceID OR @ProvinceID Is Null)

CREATE INDEX i_tCusts1 ON #tCusts(LCustName)
CREATE INDEX i_tCusts2 ON #tCusts(CustomerName, Province)

SELECT
CustomerID,
CustomerName,
Province
FROM
#tCusts
WHERE
LCustName = @LeftCustomerName
ORDER BY
CustomerName, Province
--
Terry Kreft

"Lauren Quantrell" <la*************@hotmail.com> wrote in message
news:11**********************@g44g2000cwa.googlegroups.com...
Tried it with indexed views and it's definitely slower than just
running the queries against the table itself.
lq

Dec 16 '05 #10
Terry,
Thanks for the thought on this.
Populating the temp table means in many cases inserting 100,000 plus
rows, and worse, then creating the indexes, so the performance on this
method was dismal! So in this particular case it's not a workable
solution.

Dec 17 '05 #11
Lauren,
I didn't mean that what I posted was definitive; you do have to play around
with how you fill the table and the indexing to see if you can get an
improvement.

I have done this in the real world and seen immense improvements in
performance, but of course it depends on the table structure/indexing as to
whether an improvement occurs. If it's not working then it's back to the
drawing board.

BTW, have you looked at the estimated execution plan in Query Analyzer to see
if it gives any hints on areas where you could improve performance?
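
For example, something like this in Query Analyzer will show the IO and timing for each run (the procedure name is just a placeholder, as are the parameter values):

SET STATISTICS IO ON
SET STATISTICS TIME ON
-- substitute your own procedure name and parameters
EXEC dbo.procFilterWorldCustomers @CountryID = 44, @ProvinceID = NULL, @LeftCustomerName = N'Sm'

The Display Estimated Execution Plan option will also show you which indexes (if any) are actually being used.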

As a final note: you do realise that you may end up denormalising and
splitting these records out into multiple tables again, and therefore ending up
with "data" being stored as the tables rather than in the tables, don't
you?

Although denormalising is not ideal, it is acceptable, so long as it has been
considered thoroughly as the way to go and so long as it is documented
why you went that way.
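
If you do go back to one table per country, one option on SQL Server 2000 is to put a partitioned view over them, so the front end can still query a single object. A rough sketch - the table names, view name and CountryID values are all made up:

-- each member table carries a CHECK constraint on the partitioning column
CREATE TABLE dbo.tblCustomers_Canada (
    CustomerID int NOT NULL PRIMARY KEY,
    CustomerName nvarchar(100),
    CountryID int NOT NULL CHECK (CountryID = 1),
    ProvinceID int,
    Province nvarchar(50)
)
-- ...repeat for each country, each with its own CountryID value...
GO
CREATE VIEW dbo.vwWorldCustomers
AS
SELECT CustomerID, CustomerName, CountryID, ProvinceID, Province
FROM dbo.tblCustomers_Canada
UNION ALL
SELECT CustomerID, CustomerName, CountryID, ProvinceID, Province
FROM dbo.tblCustomers_USA

With a WHERE CountryID = ... filter the optimizer can skip the member tables whose constraints rule them out. That's for SELECT performance only; there are extra rules if the view needs to be updatable.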

--
Terry Kreft

"Lauren Quantrell" <la*************@hotmail.com> wrote in message
news:11**********************@g44g2000cwa.googlegroups.com...
Terry,
Thanks for the thought on this.
Populating the temp table means in many cases inserting 100,000 plus
rows, and worse, then creating the indexes, so the performance on this
method was dismal! So in this particular case it's not a workable
solution.

Dec 17 '05 #12
Terry,
I've been playing with this every which way and it's starting to look
like I might have to split this data back into multiple tables again.
It's fairly logical as each table represents a country.
Thanks.
lq

Dec 17 '05 #13
"Lauren Quantrell" <la*************@hotmail.com> wrote in
news:11**********************@o13g2000cwo.googlegroups.com:
I've been playing with this every which way and it's starting to
look like I might have to split this data back into multiple
tables again. It's fairly logical as each table represents a
country.


I have some difficulty understanding exactly what it is that you're
trying to do that could cause this kind of performance drag. I have
pure Access apps that deal with hundreds of thousands of records on
a regular basis, and there are no issues. This includes summarizing
data, not just retrieving small sets of records, and performance is
acceptable (if not stellar).

I can't quite understand why you should be having such a performance
bottleneck. With server databases, a million records really
oughtn't to be a big deal, even if you're summarizing that data.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Dec 17 '05 #14
One thing I have realized is that my SQL Server is running on the same
machine as my Access app in my development environment, and this may be
having a major impact on the poor results I'm getting. I have not moved
this massive table to a client server yet - this table is now approx 4
million records. Like you, I've never had a problem with a couple hundred
thousand rows. This is the first time I've dealt with tables in
excess of a million rows, and now, pushing 4 million rows, I'm seeing
a bottleneck on simple select queries that makes it unusable.

Dec 18 '05 #15

I have several databases that have more than a million records in
them, and a few more over the years that have had several million
records in them.

The "trick" to this is to transfer as much processing as you can to
the server. Don't select against a table; select against a _view_ or
execute a stored procedure. (More than a few DBAs I know don't even
allow selecting against a table!)

Of course, proper indexing is required. Index the field(s) you're
selecting on, and you should see a noticeable performance increase.
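
On a development box where SQL Server and Access share the machine, it can also help to cap how much memory SQL Server grabs so the front end isn't starved. A sketch - the 512 MB figure is arbitrary:

-- 'max server memory (MB)' is an advanced option, so expose it first
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'max server memory (MB)', 512
RECONFIGURE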
On 18 Dec 2005 09:44:11 -0800, "Lauren Quantrell"
<la*************@hotmail.com> wrote:
One thing I have realized is that my SQL Server is running on the same
machine as my Access app in my development environment, and this may be
having a major impact on the poor results I'm getting. I have not moved
this massive table to a client server yet - this table is now approx 4
million records. Like you, I've never had a problem with a couple hundred
thousand rows. This is the first time I've dealt with tables in
excess of a million rows, and now, pushing 4 million rows, I'm seeing
a bottleneck on simple select queries that makes it unusable.

--
Drive C: Error. (A)bort (R)etry (S)mack The Darned Thing

Dec 19 '05 #16
Lauren,
It may be an idea to ask in comp.databases.ms-sqlserver or one of the MS SQL
groups for some suggestions.
--
Terry Kreft

"Lauren Quantrell" <la*************@hotmail.com> wrote in message
news:11**********************@f14g2000cwb.googlegroups.com...
One thing I have realized is that my SQL Server is running on the same
machine as my Access app in my development environment, and this may be
having a major impact on the poor results I'm getting. I have not moved
this massive table to a client server yet - this table is now approx 4
million records. Like you, I've never had a problem with a couple hundred
thousand rows. This is the first time I've dealt with tables in
excess of a million rows, and now, pushing 4 million rows, I'm seeing
a bottleneck on simple select queries that makes it unusable.

Dec 19 '05 #17
Thanks. One of the issues is that any but the most simple select query
times out in Access - trying to run an update query times out every
time.

Dec 19 '05 #18
