Bytes IT Community

Table exceeding one million rows crashes Access

Using MS Access 2K, I have a client with a number of separate customer
tables, one for each country (approx 50 tables), stored on a SQL Server
backend.

I cleaned up the data in the tables and inserted all the records into
one table and ended up with a table with 1,200,000 rows that looks like
this:

CustomerID int (PK)
CustomerName nvarchar(100)
CountryID int
ProvinceID int
Province nvarchar(50)

I created a stored procedure to filter the table that looks like this:

@CountryID int,
@ProvinceID int,
@LeftCustomerName nvarchar(10)

AS
set nocount on

DECLARE @strLength smallint
SELECT @strLength = LEN(@LeftCustomerName)

SELECT
CustomerID,
CustomerName,
Province
FROM
dbo.tblWorldCustomers
WHERE
(CountryID = @CountryID)
AND
(ProvinceID = @ProvinceID OR @ProvinceID Is Null)
AND
(LEFT(CustomerName,@strLength) = @LeftCustomerName)
ORDER BY
CustomerName, Province

The query allows the user to filter the customers by country,
optionally by province, and by up to the first ten letters of the
customer name. Fine and dandy when there were 100,000 records, but as I
kept inserting records, this query now starts locking up my machine. On
the first run of the query it is extremely slow -- if it completes at
all without crashing the Access app. On second and subsequent queries,
it performs acceptably (though a little sluggish - maybe returning
records in three seconds or so - certainly it will be much slower if
deployed over the client's VPN WAN.)

I have an index on CountryID, an index on ProvinceID, and I have played
with removing and inserting indexes on those and on the
CustomerName column, but can't seem to make this perform any better.
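For reference, what I have tried looks something like this (the index names here are made up for illustration; the columns are the ones from the table above):

```sql
-- Illustrative only: the kinds of single-column indexes described above,
-- with invented names
CREATE INDEX IX_Customers_CountryID  ON dbo.tblWorldCustomers (CountryID)
CREATE INDEX IX_Customers_ProvinceID ON dbo.tblWorldCustomers (ProvinceID)
CREATE INDEX IX_Customers_Name       ON dbo.tblWorldCustomers (CustomerName)
```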

The bad news is that I have another 2 million rows to insert!!!

I have not had to deal with tables with so many rows before using an
Access front end. Am I missing something here that would improve this?

Any help is greatly appreciated.
lq

Dec 16 '05 #1
17 Replies


I would try

AS

SELECT @LeftCustomerName = @LeftCustomerName) + '%'

SELECT
CustomerID,
CustomerName,
Province
FROM
dbo.tblWorldCustomers
WHERE
(CountryID = @CountryID)
AND
(ProvinceID = @ProvinceID OR @ProvinceID Is Null)
AND
CustomerName LIKE @LeftCustomerName
ORDER BY
CustomerName, Province
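The thinking here (my reasoning, not something I've measured on your table) is that wrapping the column in LEFT() forces SQL Server to evaluate the function for every row, so an index on CustomerName can't be used for a seek, while a LIKE whose pattern begins with literal characters can be resolved as an index range seek:

```sql
-- Non-sargable: the function call hides CustomerName from the index
WHERE LEFT(CustomerName, 3) = 'Smi'

-- Sargable: equivalent prefix test that can seek an index on CustomerName
WHERE CustomerName LIKE 'Smi%'
```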

--
Lyle Fairfield
Dec 16 '05 #2

Sorry, typo

SELECT @LeftCustomerName = @LeftCustomerName + '%'
--
Lyle Fairfield
Dec 16 '05 #3

Lyle,
Yes!
This is many times faster!!!!

Now I just need to figure out how to make the first time the query is
run faster.

Much thanks,
lq

Dec 16 '05 #4

Just wondering how many rows I might get away with decent performance
using a query like this?
lq

Dec 16 '05 #5

Lauren Quantrell wrote:
Just wondering how many rows I might get away with decent performance
using a query like this?
lq


I hope you will tell us.

--
Lyle Fairfield
Dec 16 '05 #6

I have no idea but I would certainly experiment with whatever came to
mind.

Here's one thing.

Create 26 indexed views which effectively split the table into 26
parts based on the first letter of the name: one for a, one for b,
one for c, ... one for z. Use VBA, T-SQL or whatever to direct your
SELECT against the appropriate view.

The views could be created something like this:

With CurrentProject.Connection

    .Execute "CREATE VIEW mSchools WITH SCHEMABINDING AS " & _
        "SELECT fldSchoolID, fldSchoolName, fldAdministratorPassword, " & _
        "fldViewerPassword, fldDomain FROM dbo.tblSCHOOLS " & _
        "WHERE fldSchoolName LIKE 'M%'"

    .Execute "SET ARITHABORT ON"

    .Execute "CREATE UNIQUE CLUSTERED INDEX mSchoolsID ON mSchools (fldSchoolID)"

    .Execute "CREATE INDEX mSchoolsNames ON mSchools (fldSchoolName)"

    .Execute "SET ARITHABORT OFF"

End With

and the Select (for "m") would be something like this:

"SELECT * FROM mSchools WHERE fldSchoolName LIKE 'Martin%'"

Would this make things faster? I don't know. It could make the whole
server run noticeably slower, with 52 more indexes to deal with.
Perhaps, someone can tell us what SHOULD happen. Only you can tell us
what does happen.

In any case

With CurrentProject.Connection
    .Execute "DROP VIEW mSchools"
End With

should be a quick cure.

Dec 16 '05 #7

Tried it with indexed views and it's definitely slower than just
running the queries against the table itself.
lq

Dec 16 '05 #8

Lauren Quantrell wrote:
Tried it with indexed views and it's definitely slower than just
running the queries against the table itself.
lq


Ouch! So much for that idea ....

--
Lyle Fairfield
Dec 16 '05 #9

Hmm, I would take the basic SQL you've got and play about with filling a
temp table first. I would expect to see a performance improvement with
something like:-

AS
DECLARE @LENLeftCustName int

SELECT @LENLeftCustName = LEN(@LeftCustomerName)

CREATE TABLE #tCusts(
CustomerID int,
LCustName nvarchar(10),
CustomerName nvarchar(100),
CountryID int,
ProvinceID int,
Province nvarchar(50)
)

INSERT INTO #tCusts(
CustomerID,
CustomerName,
LCustName,
CountryID,
ProvinceID,
Province
)
SELECT
CustomerID,
CustomerName,
LEFT(CustomerName, @LENLeftCustName),
CountryID,
ProvinceID,
Province
FROM
dbo.tblWorldCustomers
WHERE
(CountryID = @CountryID)
AND
(ProvinceID = @ProvinceID OR @ProvinceID Is Null)

CREATE INDEX i_tCusts1 ON #tCusts(LCustName)
CREATE INDEX i_tCusts2 ON #tCusts(CustomerName, Province)

SELECT
CustomerID,
CustomerName,
Province
FROM
#tCusts
WHERE
LCustName = @LeftCustomerName
ORDER BY
CustomerName, Province
--
Terry Kreft

"Lauren Quantrell" <la*************@hotmail.com> wrote in message
news:11**********************@g44g2000cwa.googlegroups.com...
Tried it with indexed views and it's definitely slower than just
running the queries against the table itself.
lq

Dec 16 '05 #10

Terry,
Thanks for the thought on this.
Populating the temp table means in many cases inserting 100,000 plus
rows, and worse, then creating the indexes, so the performance on this
method was dismal! So in this particular case it's not a workable
solution.

Dec 17 '05 #11

Lauren,
I didn't mean that what I posted was definitive; you do have to play around
with how you fill the table and the indexing to see if you can get an
improvement.

I have done this in the real world and seen immense improvements in
performance, but of course it depends on the table structure/indexing as to
whether an improvement occurs; if it's not working then it's back to the
drawing board.

BTW, have you looked at the estimated execution plan in Query Analyzer to see
if that gives any hints on areas where you could improve performance?
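For example, something like this in Query Analyzer (the filter values are placeholders) shows the plan without actually running the query:

```sql
SET SHOWPLAN_TEXT ON
GO
SELECT CustomerID, CustomerName, Province
FROM dbo.tblWorldCustomers
WHERE CountryID = 1            -- placeholder value
AND CustomerName LIKE 'A%'     -- placeholder pattern
GO
SET SHOWPLAN_TEXT OFF
GO
```

If you see a Table Scan or Clustered Index Scan on tblWorldCustomers in the output, that is usually where the time is going.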

As a final note: you do realise that you may end up denormalising and
splitting these records out to multiple tables again, and therefore ending up
with "data" being stored as the tables rather than in the tables, don't
you?

Although denormalising is not ideal, it is acceptable, so long as it has been
considered thoroughly as the way to go and so long as it is documented as to
why you went that way.

--
Terry Kreft

"Lauren Quantrell" <la*************@hotmail.com> wrote in message
news:11**********************@g44g2000cwa.googlegroups.com...
Terry,
Thanks for the thought on this.
Populating the temp table means in many cases inserting 100,000 plus
rows, and worse, then creating the indexes, so the performance on this
method was dismal! So in this particular case it's not a workable
solution.

Dec 17 '05 #12

Terry,
I've been playing with this every which way and it's starting to look
like I might have to split this data back into multiple tables again.
It's fairly logical as each table represents a country.
Thanks.
lq

Dec 17 '05 #13

"Lauren Quantrell" <la*************@hotmail.com> wrote in
news:11**********************@o13g2000cwo.googlegroups.com:
I've been playing with this every which way and it's starting to
look like I might have to split this data back into multiple
tables again. It's fairly logical as each table represents a
country.


I have some difficulty understanding exactly what it is that you're
trying to do that could cause this kind of performance drag. I have
pure Access apps that deal with hundreds of thousands of records on
a regular basis, and there are no issues. This includes summarizing
data, not just retrieving small sets of records, and performance is
acceptable (if not stellar).

I can't quite understand why you should be having such a performance
bottleneck. With server databases, a million records really
oughtn't to be a big deal, even if you're summarizing that data.

--
David W. Fenton http://www.dfenton.com/
usenet at dfenton dot com http://www.dfenton.com/DFA/
Dec 17 '05 #14

One thing I have realized is that my SQL Server is running on the same
machine as my Access app in my development environment, and this may be
having a major impact on the poor results I'm getting. I have not moved
this massive table to the client's server yet - this table is now approx 4
million records. Like you, I've never had a problem with a couple
hundred thousand rows. This is the first time I've dealt with tables in
excess of a million rows - now pushing 4 million - and I'm seeing
such a bottleneck running simple select queries that it's unusable.

Dec 18 '05 #15


I have several databases that have more than a million records in
them, and a few more over the years that have had several million
records in them.

The "trick" to this is to transfer as much processing as you can to
the server. Don't select against a table, select against a _view_ or
execute a stored procedure. (More than a few DBAs I know don't even
allow selecting against a table!)

Of course, proper indexing is required. Index the field(s) you're
selecting on, and you should see a noticeable performance increase.
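As a sketch of what I mean (the view name is invented; the columns are taken from earlier in the thread), the front end would query something like this instead of the base table:

```sql
-- Clients query the view, never the base table directly
CREATE VIEW dbo.vwWorldCustomers
AS
SELECT CustomerID, CustomerName, CountryID, ProvinceID, Province
FROM dbo.tblWorldCustomers
```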
On 18 Dec 2005 09:44:11 -0800, "Lauren Quantrell"
<la*************@hotmail.com> wrote:
One thing I have realized is that my SQL Server is running on the same
machine as my Access app in my development environment and this may be
having a major impact on the poor results I'm getting. I have not moved
this massive table to a client server yet - this table is now approx 4
million records. Like you, a couple hundred thousand rows I've never
had a problem with. This is the first time I've dealt with tables in
excess of a million rows, and now pushing 4 million rows and I'm seeing
such a bottleneck running simple select queries to make it unusable.

--
Drive C: Error. (A)bort (R)etry (S)mack The Darned Thing

Dec 19 '05 #16

Lauren,
It may be an idea to ask in comp.databases.ms-sqlserver or one of the MS SQL
groups for some suggestions.
--
Terry Kreft

"Lauren Quantrell" <la*************@hotmail.com> wrote in message
news:11**********************@f14g2000cwb.googlegroups.com...
One thing I have realized is that my SQL Server is running on the same
machine as my Access app in my development environment and this may be
having a major impact on the poor results I'm getting. I have not moved
this massive table to a client server yet - this table is now approx 4
million records. Like you, a couple hundred thousand rows I've never
had a problem with. This is the first time I've dealt with tables in
excess of a million rows, and now pushing 4 million rows and I'm seeing
such a bottleneck running simple select queries to make it unusable.

Dec 19 '05 #17

Thanks. One of the issues is that anything but the simplest select query
times out in Access - trying to run an update query times out every
time.

Dec 19 '05 #18
