Bytes | Software Development & Data Engineering Community

Access 2003 and SQL Server 2000 record locks (long)

Hello everybody,

I've got a bit of a situation that I could use some guidance with if
possible. I work for an auditing firm where my users audit electronic
shipping data for customers to see if they've been overcharged for
shipments by truck and rail carriers.

99.9% of the time, one of our auditors needs to see all data sent by a
customer in datasheet view so that they can sort records in many
different ways looking for duplication of bill payments over a
specific time period. This might involve as little as 100 records for
a small client or 200,000 or more records for a larger client. I know
this sounds ludicrous to have that many records open at one time in a
form, but you would not believe what our auditors have found at times
and thus have recovered many thousands of dollars for our clients just
because they can see the whole picture of charges in one screen full
of data. The nature of our business basically violates every database
rule there is about only retrieving small datasets and disconnecting
the recordset.

The problem I'm having with this, using Access 97, 2000 or 2003 as the
frontend and SQL Server 2000 as the backend, is record locks that
escalate into full table locks and deadlocks if a user doesn't go to
the end of a recordset with a query/form open so that Access releases
the locks. Sorting a recordset in a form causes the entire query to be
re-run, pulling the whole recordset back over the network to the
machine and then creating major locking problems when multiple users
are sorting and updating rows with data to print claims.

Yes, I've been using bound forms for a very long time and have enjoyed
the freedom of not having to marshal connections and record changes,
but that freedom is coming to an end I believe, according to what you
guys are going to say. I can see the writing on the wall about this,
but need some guidance of how to make my app perform better.

I have tried the following to fix the record locking and performance
problems using Access 2003:

1. Using an .MDB, create a client-side ADO recordset from a query,
disconnecting the recordset from the database and then assigning
the .Recordset property of the form to this ADO recordset.
Performance was horrible even with just 10,000 records. There were
also other concurrency issues that I never resolved when trying to
update the ADO recordset and flush the data back to the database.
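For reference, the disconnected-recordset pattern I attempted looks
roughly like this (the server, database, and table names here are made
up for illustration, not my real ones):

```vba
' Sketch of attempt 1: client-side ADO recordset, disconnected,
' then bound to the form. All connection details are placeholders.
Dim cn As ADODB.Connection
Dim rs As ADODB.Recordset

Set cn = New ADODB.Connection
cn.Open "Provider=SQLOLEDB;Data Source=MyServer;" & _
        "Initial Catalog=AuditDB;Integrated Security=SSPI;"

Set rs = New ADODB.Recordset
rs.CursorLocation = adUseClient       ' client-side cursor, required to disconnect
rs.CursorType = adOpenStatic
rs.LockType = adLockBatchOptimistic   ' batch updates for the later flush
rs.Open "SELECT * FROM dbo.Shipments WHERE CustomerID = 42", cn

Set rs.ActiveConnection = Nothing     ' disconnect: no locks left on the server
cn.Close

Set Me.Recordset = rs                 ' bind the form to the offline recordset
```

The disconnect itself works; it's the volume and the later
batch-update step where it fell apart for me.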

2. Tried using an ADP with stored procedures but the OLEDB driver
just goes nuts and eats all of the RAM in the machine when working
with large recordsets. I don't think an ADP will be the way I need to
go because the users change queries all the time looking for anomalies
in the data to file a claim against. I know that the auditors don't
want to learn SQL to change stored procs!

I've thought of trying the following, what do you all think?

1. Create a local table that contains all of the fields that the
auditor will need to do their thing.
2. Execute an INSERT into this local table from the SQL server with
an append query.
3. Base the auditing form on this local table so that the performance
is good and write changes back to the database with a stored proc call
passing the fields that were changed.
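Roughly, I picture steps 2 and 3 looking like this in VBA (the table,
DSN, and stored proc names below are hypothetical):

```vba
' Step 2: pull the audit set into a local table with an append query.
' "dbo_Shipments" is assumed to be a linked SQL Server table;
' tblAuditLocal is the local scratch table the form would be bound to.
Dim db As DAO.Database
Set db = CurrentDb

db.Execute "DELETE FROM tblAuditLocal", dbFailOnError
db.Execute "INSERT INTO tblAuditLocal SELECT * FROM dbo_Shipments " & _
           "WHERE CustomerID = 42", dbFailOnError

' Step 3: after an edit, push just the changed row back via a
' pass-through stored procedure call.
Dim qdf As DAO.QueryDef
Set qdf = db.CreateQueryDef("")
qdf.Connect = "ODBC;DSN=AuditDSN"     ' pass-through to SQL Server
qdf.SQL = "EXEC dbo.usp_UpdateShipment @ShipmentID=" & Me!ShipmentID & _
          ", @ClaimFlag=" & Me!ClaimFlag
qdf.ReturnsRecords = False
qdf.Execute dbFailOnError
```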

If there's some other way to do this with some sort of disconnected
recordset and binding that recordset to a form, I'm ALL EARS!

My flame suit is on for using bound forms, but they've worked like a
charm in the past before moving up to SQL server because we needed
much more database space than an Access .MDB could hold!

Thanks,
Jim
Nov 29 '07 #1
Hi Jim,

(Disclaimer, in case someone decides to flame me for making the
following statements, but here are the facts.)

Access is a micro database system and will function effectively for
micro operations -- even on the web. But for large scale operations
like yours a micro rdbms just isn't going to cut it as you are seeing.
You already have a macro back end (sql server). So you need to step up
to a macro system to overcome your issues. That would be a .Net front
end which can take advantage of ADO.Net.

ADO.Net was specifically designed to handle concurrency/deadlocking
issues and works like a champ. Performance is phenomenal because .Net
has reduced I/O to the minimum required amount. Classic ADO writes to
the disk for every operation (but still better than ODBC which has teeny
bandwidth compared to ADO and keeps a constant connection open). When
you pull data with classic ADO - it goes to a table on the disk.
ADO.Net pulls data into tables in memory. This is where you get
significant performance. Another improvement is that .Net has
reduced data looping to the minimum required (as with arrays). With
ADO.Net you Fill a dataset - no looping required. And, of course, you
have way more bandwidth.
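To make that concrete, here is roughly what the Fill pattern looks
like in C# (the connection string and table names are just examples):

```csharp
// Sketch only - connection string and query are placeholders.
using System.Data;
using System.Data.SqlClient;

string connStr = "Data Source=MyServer;Initial Catalog=AuditDB;" +
                 "Integrated Security=SSPI;";
DataTable shipments = new DataTable();

using (SqlConnection cn = new SqlConnection(connStr))
using (SqlDataAdapter da = new SqlDataAdapter(
           "SELECT * FROM dbo.Shipments WHERE CustomerID = 42", cn))
{
    // Fill opens the connection, streams the rows into the in-memory
    // table, and closes the connection - nothing stays locked.
    da.Fill(shipments);
}

// From here on, sorting and filtering happen entirely client-side:
shipments.DefaultView.Sort = "BillDate DESC";
```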

I have users that need to scroll through hundreds of thousands of
records also. I feel your pain :). I still love Access for the kinds
of operations it was designed for. But for large scale stuff - nothing
beats .Net (unless you need opensource - then Java - but .Net still has
more performance than Java).

Rich

*** Sent via Developersdex http://www.developersdex.com ***
Nov 29 '07 #2
On Nov 29, 2:08 pm, Rich P <rpng...@aol.com> wrote:
[quoted reply snipped]

Hey Rich,

Yes, I do see what you mean about using a BB gun where a .300 magnum is
required! On .Net forms where your users load up hundreds of thousands
of records, do you use the normal grid controls that come with VS.NET
or have you bought some 3rd party controls that deal with the volume
better? I tried messing with a normal grid control this morning and
the performance was not very stellar. I'm new to VS.NET and need any
and all guidance that anyone can give while moving to a larger
toolset.

Thanks,
Jim
Nov 29 '07 #3
Hi Jim,

I use the standard datagridview that comes with VS2005. Here is the
catch, though: VS2005 works best with a 2.8 GHz processor or higher and
at least 2 GB of memory. I have had hardware issues at my place for
people who are still using 1.8 GHz processors and 1 GB of memory. The
memory at 1 GB isn't actually that bad (if they aren't loading hundreds
of thousands of records), but the 1.8 GHz processor does hurt a little
bit. But if you are going to load 200,000 records into a table,
remember that in .Net it is all in memory. So 2 GB is definitely better
than 1 GB.

.Net is fairly resource intensive, but if your hardware has enough
horsepower, you won't notice it (big IF -- I know).
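For what it's worth, binding a big in-memory table to the stock grid
is only a few lines (everything here is an invented example, including
the grid's control name):

```csharp
// Sketch: bind an in-memory DataTable to the standard DataGridView.
DataTable audit = new DataTable();
audit.Columns.Add("ShipmentID", typeof(int));
audit.Columns.Add("BillDate", typeof(DateTime));
for (int i = 0; i < 200000; i++)        // simulate a large client
    audit.Rows.Add(i, DateTime.Today.AddDays(-(i % 365)));

// A BindingSource between the table and the grid keeps sorting
// and filtering responsive.
BindingSource bs = new BindingSource();
bs.DataSource = audit;
dataGridView1.DataSource = bs;

// Sorting happens against the in-memory table, not the server:
bs.Sort = "BillDate DESC";
```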

Rich

*** Sent via Developersdex http://www.developersdex.com ***
Nov 29 '07 #4
On Nov 29, 3:11 pm, Rich P <rpng...@aol.com> wrote:
[quoted reply snipped]

Thanks Rich, now it's off to the store for some training material on
VS.NET 2005!

Jim
Nov 29 '07 #5
You will love it. I still deal with Access because I support hordes of
people who are still using Access, but for enterprise operations - .Net
all the way.

Rich

*** Sent via Developersdex http://www.developersdex.com ***
Nov 29 '07 #6
On Nov 29, 4:15 pm, jsac...@comcast.net wrote:
[quoted replies snipped]

Great!

Have you researched this carefully?

I have an ASP file that shows twenty records. It appears
instantaneously. I have an ASPX file on the same site that shows the
same records. It appears in five to ten seconds.

Now it's quite possible that I haven't optimized my ADO.Net and ASPX
to the extent I've optimized my ADO and ASP. I've been using the first
for two years or so and the last for six or seven years.
Then again I have been programming for more than twenty years so it's
always possible that I have optimized them both.

Nov 30 '07 #7
