Bytes IT Community

Access 2003 and SQL Server 2000 record locks (long)

Hello everybody,

I've got a bit of a situation that I could use some guidance with if
possible. I work for an auditing firm where my users audit electronic
shipping data for customers to see if they've been overcharged for
shipments by truck and rail carriers.

99.9% of the time, one of our auditors needs to see all data sent by a
customer in datasheet view so that they can sort records in many
different ways looking for duplication of bill payments over a
specific time period. This might involve as few as 100 records for a
small client or 200,000 or more for a larger client. I know it sounds
ludicrous to have that many records open at one time in a form, but you
would not believe what our auditors have found at times; they have
recovered many thousands of dollars for our clients just because they
can see the whole picture of charges in one screenful of data. The
nature of our business basically violates every database rule there is
about retrieving only small datasets and disconnecting the recordset.

The problem I'm having with this, using Access 97, 2000 or 2003 as the
frontend and SQL Server 2000 as the backend, is record locks that
escalate into full table locks / deadlocks if a user doesn't go to the
end of a recordset with a query/form open so that Access releases the
locks. Sorting a recordset in a form causes the entire query to be
re-run, pulling the whole recordset back over the network to the
machine and creating major locking problems when multiple users are
sorting and updating rows with data to print claims.
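To make the failure mode concrete, the usual band-aid is to force Access to fetch the whole keyset as soon as the form opens, so the shared locks get released instead of being held while a user sits mid-recordset. A rough sketch (the form name is made up, and this trades lock duration for a big up-front fetch over the network):

```vba
' Rough sketch: force a full fetch on open so Access releases the
' shared locks it holds while the keyset is still being populated.
' Assumes the auditing form is bound to the big query.
Private Sub Form_Load()
    With Me.RecordsetClone
        If Not .EOF Then
            .MoveLast   ' pull all remaining keys from SQL Server
            .MoveFirst
        End If
    End With
End Sub
```

This doesn't fix the re-fetch on sort, but it does keep one idle user from pinning locks on rows everyone else needs.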

Yes, I've been using bound forms for a very long time and have enjoyed
the freedom of not having to marshal connections and record changes,
but that freedom is coming to an end I believe, according to what you
guys are going to say. I can see the writing on the wall about this,
but need some guidance of how to make my app perform better.

I have tried the following to fix the record locking and performance
problems using Access 2003:

1. Using an .MDB, create a client-side ADO recordset from a query,
disconnect the recordset from the database and then assign the
.Recordset property of the form to this ADO recordset. Performance was
horrible even with just 10,000 records. There were also other
concurrency issues that I never resolved when trying to update the ADO
recordset and flush the data back to the database.
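For anyone curious, attempt 1 looked roughly like this (a sketch only; the connection string and table name are placeholders, and binding an ADO recordset to a form this way only works in Access 2002 and later):

```vba
' Sketch of the disconnected client-side recordset from attempt 1.
' Server, database and table names are made up for illustration.
Dim cn As ADODB.Connection
Dim rs As ADODB.Recordset

Set cn = New ADODB.Connection
cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
        "Initial Catalog=Audit;Integrated Security=SSPI;"

Set rs = New ADODB.Recordset
rs.CursorLocation = adUseClient      ' client-side cursor, required to disconnect
rs.CursorType = adOpenStatic
rs.LockType = adLockBatchOptimistic  ' queue edits locally for UpdateBatch later
rs.Open "SELECT * FROM Shipments", cn

Set rs.ActiveConnection = Nothing    ' disconnect -- no locks held on the server
Set Me.Recordset = rs                ' bind the form to the ADO recordset

' Later, to flush queued edits back:
' Set rs.ActiveConnection = cn
' rs.UpdateBatch
```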

2. Tried using an ADP with stored procedures but the OLEDB driver
just goes nuts and eats all of the RAM in the machine when working
with large recordsets. I don't think an ADP will be the way I need to
go because the users change queries all the time looking for anomalies
in the data to file a claim against. I know that the auditors don't
want to learn SQL to change stored procs!

I've thought of trying the following, what do you all think?

1. Create a local table that contains all of the fields that the
auditor will need to do their thing.
2. Execute an INSERT into this local table from the SQL server with
an append query.
3. Base the auditing form on this local table so that the performance
is good and write changes back to the database with a stored proc call
passing the fields that were changed.
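In code, the plan above would be something like the following sketch (the table, stored procedure, field and parameter names are all made up for illustration):

```vba
' Step 2: pull the audit set into a local table with an append query.
' tblAuditLocal, dbo_Shipments and the WHERE clause are hypothetical.
CurrentDb.Execute _
    "INSERT INTO tblAuditLocal " & _
    "SELECT * FROM dbo_Shipments WHERE CustomerID = 1234", dbFailOnError

' Step 3: push one changed row back via a stored procedure call.
Dim cmd As ADODB.Command
Set cmd = New ADODB.Command
cmd.ActiveConnection = "Provider=SQLOLEDB;Data Source=MyServer;" & _
                       "Initial Catalog=Audit;Integrated Security=SSPI;"
cmd.CommandType = adCmdStoredProc
cmd.CommandText = "usp_UpdateShipment"
cmd.Parameters.Append cmd.CreateParameter( _
    "@ShipmentID", adInteger, adParamInput, , Me!ShipmentID)
cmd.Parameters.Append cmd.CreateParameter( _
    "@ClaimFlag", adBoolean, adParamInput, , Me!ClaimFlag)
cmd.Execute
```

The upside is the form stays bound to a local table, so sorting never touches the server; the cost is writing the writeback plumbing per updatable field.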

If there's some other way to do this with some sort of disconnected
recordset and binding that recordset to a form, I'm ALL EARS!

My flame suit is on for using bound forms, but they've worked like a
charm in the past before moving up to SQL server because we needed
much more database space than an Access .MDB could hold!

Thanks,
Jim
Nov 29 '07 #1
6 Replies


Hi Jim,

(disclaimer in case someone decides to flame me for making the
following statements, but here are the facts)

Access is a micro database system and will function effectively for
micro operations -- even on the web. But for large scale operations
like yours a micro rdbms just isn't going to cut it as you are seeing.
You already have a macro back end (sql server). So you need to step up
to a macro system to overcome your issues. That would be a .Net front
end which can take advantage of ADO.Net.

ADO.Net was specifically designed to handle concurrency/deadlocking
issues and works like a champ. Performance is phenomenal because .Net
has reduced I/O to the minimum required amount. Classic ADO writes to
the disk for every operation (but still better than ODBC which has teeny
bandwidth compared to ADO and keeps a constant connection open). When
you pull data with classic ADO - it goes to a table on the disk.
ADO.Net pulls data into tables in memory. This is where you get
significant performance. And another improvement is that .Net has
reduced data looping to the minimum required (like for arrays). With
ADO.Net you Fill a dataset - no looping required. And, of course, you
have way more bandwidth.

I have users that need to scroll through hundreds of thousands of
records also. I feel your pain :). I still love Access for the kinds
of operations it was designed for. But for large scale stuff - nothing
beats .Net (unless you need opensource - then Java - but .Net still has
more performance than Java).

Rich

*** Sent via Developersdex http://www.developersdex.com ***
Nov 29 '07 #2

On Nov 29, 2:08 pm, Rich P <rpng...@aol.com> wrote:
Hey Rich,

Yes, I do see what you mean about using a BB-gun where a .300 magnum is
required! On .NET forms where your users load up hundreds of thousands
of records, do you use the normal grid controls that come with VS.NET
or have you bought some 3rd party controls that deal with the volume
better? I tried messing with a normal grid control this morning and
the performance was not very stellar. I'm new to VS.NET and need any
and all guidance that anyone can give while moving to a larger
toolset.

Thanks,
Jim
Nov 29 '07 #3

Hi Jim,

I use the standard datagridview that comes with VS2005. Here's the
catch, though: VS2005 works best with a 2.8 GHz processor or higher and
at least 2 gigs of memory. I have had hardware issues at my place for
people who are still using 1.8 GHz processors and 1 gig of memory. The
memory at 1 gig isn't actually that bad (if they aren't loading hundreds
of thousands of records), but the 1.8 GHz processor does hurt a little
bit. But if you are going to load 200,000 records into a table,
remember that in .Net it is all in memory. So 2 gigs is definitely
better than 1 gig.

.Net is fairly resource intensive, but if your hardware has enough
horsepower, you won't notice it (big IF -- I know).

Rich

Nov 29 '07 #4

On Nov 29, 3:11 pm, Rich P <rpng...@aol.com> wrote:
Thanks Rich, now it's off to the store for some training material on
VS.NET 2005!

Jim
Nov 29 '07 #5

You will love it. I still deal with Access because I support hordes of
people who are still using Access, but for enterprise operations - .Net
all the way.

Rich

Nov 29 '07 #6

On Nov 29, 4:15 pm, jsac...@comcast.net wrote:
Great!

Have you researched this carefully?

I have an ASP file that shows twenty records. It appears
instantaneously.
I have an ASPX file on the same site that shows the same records. It
appears in five to ten seconds.

Now it's quite possible that I haven't optimized my ADO.Net and ASPX
to the extent I've optimized my ADO and ASP. I've been using the first
for two years or so and the last for six or seven years.
Then again, I have been programming for more than twenty years, so it's
always possible that I have optimized them both.

Nov 30 '07 #7
