
Export fast, my code slow =[

I'm doing some speed tests. I created a brand-new table, with just
one "Long Integer" field. I'm testing adding 1000 records. If I use
the "Export" feature in Access, it takes only a few seconds to export
the 1000 records to a new table on the server. If I use my code
below, it takes 100 seconds (10 records per second). Can someone help
me speed up my code?

--- code ---

Dim con As New ADODB.Connection
Dim x As Long

con.Open "Provider=MSDASQL.1;DRIVER={MySQL ODBC 3.51 Driver};" & _
         "SERVER=www.mydomain.com;PORT=3306;OPTION=3;DATABASE=mydbname;UID=myuid;PWD=mypwd;"

'con.BeginTrans
Dim rs As New ADODB.Recordset
rs.Open "Table1", con, adOpenKeyset, adLockOptimistic

' This loop takes ~100 seconds
For x = 1 To 1000
    rs.AddNew
    rs.Fields("ID").Value = x
    rs.Update
Next x

rs.Close
'con.CommitTrans
con.Close

---- end code ----

Please note I had to comment out the BeginTrans and CommitTrans
because I was getting an error at CommitTrans saying that "No
Transaction has been started". I think this is because either the
ODBC driver (MyODBC) or the server (MySQL, with MyISAM) doesn't
support transactions (probably because I'm using MyISAM data format,
with a 3.x version of MySQL server).
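
For reference, here is a minimal sketch of the transaction pattern those two
lines were aiming for, assuming a driver and storage engine that actually
support transactions (e.g. InnoDB tables rather than MyISAM); it is the same
loop, with con opened and rs declared as in the code above:

con.BeginTrans                      ' one transaction for the whole batch
rs.Open "Table1", con, adOpenKeyset, adLockOptimistic
For x = 1 To 1000
    rs.AddNew
    rs.Fields("ID").Value = x
    rs.Update
Next x
rs.Close
con.CommitTrans                     ' commit all 1000 inserts at once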

At any rate, export is fast, so there must be a way to speed up adding
records. I can't just use export because I actually need more control
than just dumping records over, but for now I'm just trying to find
the fastest method for adding several records.

I think the above code is sending an "INSERT INTO" SQL statement to
the server for each record I add. Is there another method I can use,
without hand-coding the SQL myself, that would send all 1000 adds as a
single INSERT INTO statement? MySQL does support multiple rows in the
VALUES() part of the INSERT INTO statement. If possible I'd like to
stick with code that also works with MSSQL, or would work with only
minor changes... which is one reason I haven't tried any native MySQL
APIs.
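
Here is a rough sketch of the kind of thing I mean, though it hand-codes the
SQL (which I'd rather avoid) and uses the multi-row VALUES form that MySQL
accepts; it may need changes for MSSQL:

' Rough sketch: batch all 1000 rows into one INSERT statement.
' Uses the same "con" connection and Table1/ID column as the code above.
Dim sql As String
Dim parts(1 To 1000) As String
Dim x As Long

For x = 1 To 1000
    parts(x) = "(" & x & ")"
Next x

sql = "INSERT INTO Table1 (ID) VALUES " & Join(parts, ",")
con.Execute sql, , adCmdText + adExecuteNoRecords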

The reason I posted this to the MSAccess group instead of MySQL/MyODBC
is to see if anyone knows what method MS Access uses for Export and why
it is so much faster, and also because my code is in VBA. I'll probably
set up netmon; maybe I can see the actual SQL statements being sent by
each method.

Jun 29 '07 #1
7 Replies


Yes, that is correct: your code is taking much longer because it's appending
the records one at a time instead of in a batch. Use an INSERT INTO query to
append the records to the target table in one batch. In its simplest
incarnation, it would be:

INSERT INTO Table2 ( FieldSomething )
SELECT Table1.FieldWhatever
FROM Table1

You can also add a WHERE clause, append multiple fields, or whatever
you need.
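
If the server table is linked into your MDB via ODBC, something along these
lines would run that append from VBA (the table and field names are just the
placeholders from the example above):

' Sketch: run the append query against an ODBC-linked table.
' Assumes Table2 is the linked server table and Table1 is local.
Dim db As DAO.Database
Set db = CurrentDb
db.Execute "INSERT INTO Table2 (FieldSomething) " & _
           "SELECT Table1.FieldWhatever FROM Table1", dbFailOnError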

Neil


Jun 29 '07 #2

If you are exporting data to a server DB (well, for MS SQL Server), you have
a few options. Note that you can't do a straightforward INSERT INTO from an
external DB to a server DB; you have to use a linked server for this:

One option is to use the OpenRowset method once you have established a
linked server. The only caveat is that the Access mdb has to reside on the
same server computer as the SQL Server. Then you can use INSERT INTO.

The other option, which is more versatile, is to use VB.Net (VB2005).
VB2005 has some very nice features for reading bulk data from an external
database like Access into a server database (SQL Server, MySQL,
Oracle...). No looping involved; use DataAdapters.

A third option is to export your data to a text file using
DoCmd.TransferText... and then read the text file into the server DB
using BULK INSERT:

BULK INSERT serverTable
FROM 'C:\1a\testdata.txt'
WITH
(
    FIELDTERMINATOR = '|',
    ROWTERMINATOR = '|\n'
)

You might be able to use an ADODB.Command object to run the BULK INSERT (I
haven't tried it from ADODB); something like this, assuming "con" is an open
connection to the SQL Server:

Dim cmd As New ADODB.Command
Set cmd.ActiveConnection = con
cmd.CommandText = "BULK INSERT serverTable " _
    & "FROM 'C:\1a\testdata.txt' " _
    & "WITH (FIELDTERMINATOR = '|', ROWTERMINATOR = '|\n')"
cmd.Execute

But you can run this code from VB2005, and you can definitely run it from
SQL Server Query Analyzer. That might be your easiest bet (if you are
using MS SQL Server); I don't know how it might work with MySQL. Oh yeah,
you have to create the server table first so you have something to
insert into.

Rich

Jun 29 '07 #3

I disagree. If he's using ODBC linked tables, then he can indeed simply do
an Insert Into from Access to his server db without using a linked server.
He can link from one db and append to links to another server db, whether or
not the two dbs are on the same server computer; or he can append native Jet
data to his server db.
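
For example, something roughly along these lines should do it: link the
server table once (the connect string below is only illustrative, and the
table names are placeholders; a DSN works just as well), then run an
ordinary append query against the link:

' Sketch: create an ODBC link to the server table, then append through it.
DoCmd.TransferDatabase acLink, "ODBC Database", _
    "ODBC;DRIVER={MySQL ODBC 3.51 Driver};SERVER=www.mydomain.com;" & _
    "DATABASE=mydbname;UID=myuid;PWD=mypwd;", acTable, "Table1", "ServerTable1"

CurrentDb.Execute "INSERT INTO ServerTable1 (ID) " & _
                  "SELECT ID FROM LocalTable1", dbFailOnError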

Neil

"Rich P" <rp*****@aol.comwrote in message
news:46*********************@news.qwest.net...
If you are exporting data to a server DB -- Well for MS sql server --
you have a few options - note: you can't do a straight forward Insert
Into from an External DB to a server DB - you have to use a Linked
Server for this:

one option is to use the OpenRowset method once you have established a
linked server. The only caveat is that the Access mdb has to reside on
the same server computer as the Sql server. Then you can use Insert
Into.

Jun 29 '07 #4


This is correct. However, the original post was using ADODB. I
suggested methods for ADODB.

Also, I have a correction: the Bulk Insert only works on the server
computer, so the text file would have to be exported to the server
computer and the bulk insert run from the server computer.

The only option for inserting rows to a Remote Server - remotely - using
ADO (that is - ADODB) without looping is to use ADO.Net in VB.Net or C#
(very fast).

Yes, you can also use Insert Into with ODBC, but I don't think that will
be much faster than looping through an ADODB recordset. It might be,
but neither comes close to the performance you get with ADO.Net.

Rich

Jun 29 '07 #5


"Rich P" <rp*****@aol.comwrote in message
news:46***********************@news.qwest.net...
>>>
I disagree. If he's using ODBC linked tables, then he can indeed simply
do an Insert Into from Access to his server db without using a linked
server. He can link from one db and append to links to another server
db, whether or
not the two dbs are on the same server computer; or he can append native
Jet data to his server db.
<<

This is correct. However, the original post was using ADODB. I
suggested methods for ADODB.
I realized that he was using ADO in his code; but that didn't negate the
possibility of his using ADO within an MDB file. Since he had mentioned the
"ODBC driver (MyODBC)", I assumed that he also had linked ODBC tables, even
though his ADO code wasn't using them.

>>
Yes, you can also use Insert Into with ODBC, but I don't think that will
be much faster than looping through an ADODB recordset. It might be,
but neither comes close to the performance you get with ADO.Net.
<<
If he's just doing a bulk insert, I think it'll be much faster. If his
insert uses subqueries, or a heterogeneous join, etc., it probably would be
slower.

Neil
Jun 29 '07 #6

Thanks. I was thinking that would only work if both tables were on
the server, and my problem is getting the data up to the server in the
first place... but I guess that might also work using linked tables.

Also, thanks to all others who replied. Now I have lots of options to
explore! =]

Jun 29 '07 #7

Yes, linked tables make it very easy. As Rich pointed out, there might be
some performance hits, but I think if your query is pretty straightforward,
you should get pretty good performance.

Neil


Jun 29 '07 #8
