
VBA crashes on large table

gershwyn
I have a large table (~600,000 records) that needs a unique record number assigned to each record. The table itself is too large to add an autonumber field to directly, so I was trying to get around that through VBA.

I have a function that will go through the table's recordset and assign a number to each record. The problem is that it crashes at around record 180,000 (and it irrecoverably damaged the VBA module - it wouldn't even let me delete it).

In troubleshooting the problem I noticed that as the function runs, the size of the database increases greatly, and the crash is caused by Access hitting the 2 GB database limit. I'm wondering if the increased size is due to Access trying to keep everything in one transaction, and whether I could tell it to forget about remembering everything and just make the changes as it goes. Something like a 'commit immediate' mode.
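(As far as I know DAO has no true 'commit immediate' setting; the closest thing seems to be holding the transaction yourself and committing in batches. A hedged, untested sketch - the function name and the 5,000-row batch size are arbitrary assumptions, and Jet may still grow the file somewhat, so a compact can still be needed afterwards:)

  ' Hedged sketch (untested at this scale): commit edits in batches rather
  ' than letting Jet accumulate everything behind the scenes.
  Function NumberInBatches(TableName As String)
    Dim ws As DAO.Workspace
    Dim rs As DAO.Recordset
    Dim Count As Long

    Set ws = DBEngine.Workspaces(0)    ' CurrentDb lives in this workspace
    Set rs = CurrentDb.OpenRecordset(TableName, dbOpenDynaset)

    ws.BeginTrans
    Do While Not rs.EOF
      Count = Count + 1
      rs.Edit
      rs.Fields("Record_ID") = Count
      rs.Update
      If Count Mod 5000 = 0 Then
        ws.CommitTrans               ' flush this batch to disk...
        ws.BeginTrans                ' ...and start a fresh transaction
      End If
      rs.MoveNext
    Loop
    ws.CommitTrans
    rs.Close
  End Function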

This is the entirety of the VBA module, I call this function directly from the immediate window since it only needs to happen once:

  Function NumberTables(TableName)
    Dim rs As Recordset
    On Error Resume Next
    DoCmd.SetWarnings False
    DoCmd.RunSQL "ALTER TABLE [" & TableName & "] ADD COLUMN [Record_ID] INTEGER;"
    If Err = 3380 Then Exit Function   ' 3380 = field already exists
    On Error GoTo 0
    Set rs = CurrentDb.TableDefs(TableName).OpenRecordset
    rs.MoveFirst
    Do While Not rs.EOF
      Count = Count + 1
      rs.Edit
      rs.Fields("Record_ID") = Count
      rs.Update
      rs.MoveNext
    Loop
    rs.Close
  End Function
Any idea how I can cause this not to bloat horribly out of control?
Apr 27 '10 #1
6 Replies


TheSmileyCoder
I don't know about any kind of "commit immediate" mode, but a few different approaches and questions spring to mind:
Q1) If you do a compact and repair, how large is the database?

Q2) Could you copy the structure into a new table, add the autonumber field, and then append the records using a standard append query? (See the sketch after this list.)

Q3) When the count reaches 100,000, exit the loop, compact & repair, then continue the loop with those records that have not yet been assigned an ID.

Q4) I thought I had a 4...but either I forgot it, or I never had a 4....

Q5) Make a module with a public variable, intCount, and a public function fintCounter() that returns intCount and increments it by one. Then run a normal update query where you update the field to fintCounter(). That will keep you out of VBA recordsets, and may cause less bloating.
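(A hedged, untested sketch of what Q2 could look like in SQL; [BigTable], [BigTable_New], and the field names are placeholders. Note that SELECT...INTO copies field types but not indexes or the primary key, so re-add those afterwards if you need them.)

  Sub CopyStructureAndRenumber()
    With CurrentDb
      ' 1) Structure-only copy: the impossible WHERE clause returns no rows.
      .Execute "SELECT * INTO [BigTable_New] FROM [BigTable] WHERE 1 = 0;"
      ' 2) Add the AutoNumber field while the copy is still empty.
      .Execute "ALTER TABLE [BigTable_New] ADD COLUMN [Record_ID] COUNTER;"
      ' 3) Append everything; Jet assigns Record_ID as each row arrives.
      '    List the real columns so [Record_ID] is left for Jet to fill.
      .Execute "INSERT INTO [BigTable_New] ([Field1], [Field2]) " & _
               "SELECT [Field1], [Field2] FROM [BigTable];", dbFailOnError
    End With
  End Sub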
Apr 27 '10 #2

gershwyn
1- After a compact and repair, the database is 1.32 GB.

2- I hadn't thought of this, actually. It seems like a simple solution, but I hesitate to copy the data over because I need it exactly the way it is, and I worry that some trivial detail would go unnoticed until later.

3- I was thinking of this as a possible solution before I noticed the file size issue. I could rewrite the function so that if the Record_ID field already exists but contains null values, it picks up where it left off and keeps numbering. And then, as you say, terminate the function every 100,000 records or so and compact. (A sketch of this follows below.)

5- I had tried this first, but I must have written the function incorrectly - it updated every record to the same value. It was after this I went to the recordset option.

Thank you for the suggestions. I will see if I can't get this to work after all. (Though if anyone does know about the filesize thing, I am curious.)
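(A hedged, untested sketch of the resume idea from point 3: seed the counter from the highest number already assigned and walk only the unnumbered rows. The 100,000-row stopping point is the one suggested above.)

  Function ResumeNumbering(TableName As String)
    Dim rs As DAO.Recordset
    Dim Count As Long

    ' Nz covers the first run, before any rows are numbered.
    Count = Nz(DMax("[Record_ID]", TableName), 0)

    Set rs = CurrentDb.OpenRecordset( _
        "SELECT * FROM [" & TableName & "] WHERE [Record_ID] IS NULL;")

    Do While Not rs.EOF
      Count = Count + 1
      rs.Edit
      rs.Fields("Record_ID") = Count
      rs.Update
      If Count Mod 100000 = 0 Then Exit Do   ' pause here for a compact & repair
      rs.MoveNext
    Loop
    rs.Close
  End Function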
Apr 27 '10 #3

Jim Doherty
@gershwyn
Keeping 600,000 records in one table and appending them to another table in the same database? That makes 1,200,000 rows between them, and bang goes the file size limit.

A) Why not just export the data to a delimited text file? (Test it out in a new database as a linked text file to view the file contents.)

B) Copy the structure of the existing table to a new table, as has already been suggested. Delete your existing table only when you are content that your now 'external' data can be imported successfully, using a test database.

C) Add the autonumber to the new table and import your data. (Do NOT index anything, or else the import will be slow. You can always add any indexes once the table has been populated. A sketch of the export/import calls follows.)
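(A hedged sketch of the export and re-import steps; the file path and table names are placeholders, and the final True tells TransferText the first row holds field names.)

  ' Export the data to a delimited text file.
  DoCmd.TransferText acExportDelim, , "BigTable", "C:\Temp\BigTable.txt", True
  ' ...create the new table with its AutoNumber field (and no other indexes),
  ' then pull the data back in, ideally into a fresh test database first.
  DoCmd.TransferText acImportDelim, , "BigTable_New", "C:\Temp\BigTable.txt", True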

IMHO, using recordsets and loops to 'update' or 'batch update' very large datasets with BeginTrans/CommitTrans is slow in my experience. I favour raw SQL wherever possible, most of the time.

As pointed out, a different approach will bear fruit here, I think.

Regards

Jim
Apr 27 '10 #4

ADezii
@gershwyn
  1. I'm surprised that the code runs at all since [Record_ID] is Defined as an INTEGER Data Type whose Maximum Value can only be 32,767. It should be defined as a LONG INTEGER.
  2. Manually add the [Record_ID] Field to your Table, create an Update Query that generates Unique Values in the [Record_ID] Field, and set the Use Transaction Property of the Query to NO, which will force the Query NOT to run as a Single Transaction. This is only theory and has not actually been tested.
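(In the query design grid that is just the Use Transaction property set to No. To do the same from code, a hedged, untested sketch - the property has to be created the first time it is set, and the query name here is a placeholder:)

  Sub SetUseTransactionOff()
    Dim qdf As DAO.QueryDef
    Dim prp As DAO.Property

    Set qdf = CurrentDb.QueryDefs("qryNumberRecords")  ' placeholder name

    On Error Resume Next
    qdf.Properties("UseTransaction") = False
    If Err.Number = 3270 Then        ' 3270 = "Property not found"
      Err.Clear
      Set prp = qdf.CreateProperty("UseTransaction", dbBoolean, False)
      qdf.Properties.Append prp
    End If
    On Error GoTo 0
  End Sub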
Apr 28 '10 #5

gershwyn
@ADezii
...

I can't believe I didn't notice that. The code "runs" fine but is throwing error after error, though my SetWarnings command is suppressing them. (I also just noticed I never turned warnings back on again - oops.)

Defining the field as a Long Integer solved the problem. I suspect the file size issue was Access compiling a list of the thousands of errors that were occurring. I don't know how that works with updating individual records in a recordset, but if this were a single update query it would give me the option of continuing or canceling, which means it must save the old values somewhere.

This whole thing was meant to be quick and dirty and I only succeeded at the dirty part.

Your second suggestion probably would have saved a lot of time if I could get it to work. The following function results in updating everything to 1. Running it again changes everything to 2. What do I need to do to make it increment correctly?

  Option Explicit
  Public Counter As Long

  Function ReturnCount()
    Counter = Counter + 1
    ReturnCount = Counter
  End Function
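(The usual explanation for this behaviour is that Jet treats a function that takes no arguments as a constant and evaluates it once for the whole statement, so the single increment is shared by every row - and the next run increments it once more, to 2. The common workaround is to pass a field from the table as a dummy argument, which forces a call per row. A hedged sketch, with the field name as a placeholder:)

  Option Explicit
  Public Counter As Long

  ' The unused argument forces Jet to re-evaluate the function on every row.
  Function ReturnCount(ByVal AnyFieldValue As Variant) As Long
    Counter = Counter + 1
    ReturnCount = Counter
  End Function

The update query would then be something like UPDATE [BigTable] SET [Record_ID] = ReturnCount([Field1]);, after resetting Counter to 0 (for example, from the Immediate window).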
Apr 28 '10 #6

ADezii
@gershwyn
Does the Unique ID have to be Numeric, and if it does, does it have to be sequential?
Apr 28 '10 #7
