Bytes | Software Development & Data Engineering Community

Speeding Up Communication With Access

I've created an application that downloads data daily from a secure
web site and stores that data in an Access database. Then there are
different options for allowing the user to do keyword and date
searches on the database and have the information displayed for them.

Everything looks and functions great, my only real dissatisfaction
with my application is the update time, which in my last test took
about 45-46 minutes for 9800 records.

When the update occurs, generally three steps are performed. The data
is downloaded as a single XML string, the XML string is parsed, and
then the new records are added to the database. I add all of the
records to a temporary table (one by one) from VB.NET using INSERT
INTO and then have a couple of queries that merge the temporary table
data into the final table.

I've determined that the download is taking roughly 8% of the
process time, the parsing is taking about 4%, and the communication
with the database is taking 88% (45-46 minutes) of the time.

These 45 minutes consist almost exclusively of INSERT INTO statements
adding records one at a time in the following way...

...string sSQL (INSERT INTO blah blah blah) is constructed
right before this...

myCnxn = New OleDbConnection(sDBCnxnString)
cmUsrSQL = New OleDbCommand(sSQL, myCnxn)
myCnxn.Open()
iDummy = cmUsrSQL.ExecuteNonQuery()
myCnxn.Close()

...the next sSQL is constructed and the process is repeated...

Is there a way to add records en masse? If I'm using INSERT INTO, can
I add more than one record with a single SQL statement?

Just now as I type this I'm wondering how much time I'm wasting
opening and closing and opening and closing...

Any links to any articles on this topic would be greatly appreciated.

Thx.

Oct 31 '07 #1
Two things to note.
1. DO NOT OPEN AND CLOSE THE CONNECTION for every statement. Open it once
and perform the data inserts as required.
2. Use a StringBuilder to build your SQL. This can help with repeated
iterations.
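A minimal sketch of point 1, reusing one open connection for the whole run. The names execSQLNonQuery-style helpers would use (sDBCnxnString, sSQL) are taken from the thread; the rest is illustrative, not the poster's actual code:

```vbnet
Imports System.Data.OleDb

Module BatchInsert
    ' One connection for the whole update run, opened once.
    Sub RunUpdate(ByVal sDBCnxnString As String, _
                  ByVal sqlStatements As IEnumerable(Of String))
        Dim myCnxn As New OleDbConnection(sDBCnxnString)
        myCnxn.Open()
        Try
            For Each sSQL As String In sqlStatements
                ' Reuse the open connection; only the command text changes.
                Using cmd As New OleDbCommand(sSQL, myCnxn)
                    cmd.ExecuteNonQuery()
                End Using
            Next
        Finally
            myCnxn.Close()
        End Try
    End Sub
End Module
```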

Regards,

Trevor Benedict
MCSD

"ags5406" <my**********@gmail.comwrote in message
news:11**********************@d55g2000hsg.googlegr oups.com...
I've created an application that downloads data daily from a secure
web site and stores that data in an Access database. Then there are
different options for allowing the user to do keyword and date
searches on the database and have the information displayed for them.

Everything looks and functions great, my only real dissatisfaction
with my application is the update time, which in my last test took
about 45-46 minutes for 9800 records.

When the update occurs, generally three steps are performed. The data
is downloaded as a single XML string, the XML string is parsed, and
then the new records are added to the database. I add all of the
records to a temporary table (one by one) from VB.NET using INSERT
INTO and then have a couple of queries that merge the temporary table
data into the final table.

I've have determined that the download is taking roughly 8% of the
process time, the parsing is taking about 4%, and the communication
with the database is taking 88% (45-46 minutes) of the time.

This 45 minutes almost exclusively consist of INSERT INTO statements
adding records one at a time in the following way...

...string sSQL (INSERT INTO blah blah blah) is constructed
right before this...

myCnxn = New OleDbConnection(sDBCnxnString)
cmUsrSQL = New OleDbCommand(sSQL, myCnxn)
myCnxn.Open()
iDummy = cmUsrSQL.ExecuteNonQuery
myCnxn.Close()

...the next sSQL is constructed and the process is repeated...

Is there a way to add records in mass? If I'm using INSERT INTO, can
I add more than one record with a single SQL statement?

Just now as I type this I'm wondering how much time I'm wasting
opening and closing and opening and closing...

Any links to any articles on this topic would be greatly appreciated.

Thx.

Oct 31 '07 #2
On Oct 31, 2:53 pm, ags5406 <my.spam.5...@gmail.com> wrote:
>...string sSQL (INSERT INTO blah blah blah) is constructed
>right before this...
>
>myCnxn = New OleDbConnection(sDBCnxnString)
>cmUsrSQL = New OleDbCommand(sSQL, myCnxn)
>myCnxn.Open()
>iDummy = cmUsrSQL.ExecuteNonQuery
>myCnxn.Close()
>
>...the next sSQL is constructed and the process is repeated...

Assuming that the SQL string is basically the same, just with different
parameters... Pseudo code:

Using connection As OleDbConnection = New OleDbConnection(connectionString)
    connection.Open()
    Using command As OleDbCommand = New OleDbCommand( _
            "INSERT INTO mytable (column1, column2, column3) " & _
            "VALUES (@value1, @value2, @value3)", connection)

        ' Add the parameters once; then only their values change per row.
        command.Parameters.Add("@value1", OleDbType.VarWChar)
        command.Parameters.Add("@value2", OleDbType.VarWChar)
        command.Parameters.Add("@value3", OleDbType.VarWChar)

        For Each record In myRecordsToInsert
            command.Parameters("@value1").Value = record.Value1
            command.Parameters("@value2").Value = record.Value2
            command.Parameters("@value3").Value = record.Value3

            command.ExecuteNonQuery()
        Next
    End Using
End Using

anyway... something like that :) i hope there's no C# in there :)

--
Tom Shelton

Oct 31 '07 #3

"Trevor Benedict" <Tr********@yahoo.comwrote in message news:eH**************@TK2MSFTNGP04.phx.gbl...
>2 things to note.
1. DO NOT OPEN AND CLOSE THE CONNECTION. Open it once and perform the data insert as required
2. Use a StringBuilder to build your SQL. This could help with repeated iterations.
"ags5406" <my**********@gmail.comwrote in message news:11**********************@d55g2000hsg.googlegr oups.com...
>I've created an application that downloads data daily from a secure
web site and stores that data in an Access database. Then there are
different options for allowing the user to do keyword and date
searches on the database and have the information displayed for them.

Everything looks and functions great, my only real dissatisfaction
with my application is the update time, which in my last test took
about 45-46 minutes for 9800 records.

10 years ago on a 486-66 I used to get 15000 rows per second on a local HD.

The above advice is good; reuse the connection.

Use transactions to do batches; I recall 1000 being a good batch size. There
used to be (I think) a 64K limit on commands.

Also, working against a local table is much faster. I.e., create a local db,
fire off some CREATE TABLE commands, then insert your data locally. Finally,
do a SELECT INTO to firehose the data into your remote server. This
also works very well in reverse, for creating reports: select some data
into a local table, massage it, then dump it into a report. Delete the
local database when done.

You should be 100% disk bound on this.
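A sketch of the transaction batching described above, with a batch size of 1000. The table and column names (TempImport, Field1, Field2) are made up for illustration; OLE DB parameters are positional, so `?` placeholders are used:

```vbnet
Imports System.Data.OleDb

Module TransactionBatch
    ' Commit in batches so Jet flushes to disk once per batch
    ' rather than once per row.
    Sub InsertInBatches(ByVal cnxn As OleDbConnection, _
                        ByVal rows As List(Of String()))
        Const BatchSize As Integer = 1000
        Dim tx As OleDbTransaction = cnxn.BeginTransaction()
        Dim cmd As New OleDbCommand( _
            "INSERT INTO TempImport (Field1, Field2) VALUES (?, ?)", cnxn, tx)
        cmd.Parameters.Add("@p1", OleDbType.VarWChar)
        cmd.Parameters.Add("@p2", OleDbType.VarWChar)

        Dim count As Integer = 0
        For Each row As String() In rows
            cmd.Parameters(0).Value = row(0)
            cmd.Parameters(1).Value = row(1)
            cmd.ExecuteNonQuery()
            count += 1
            ' Close out the batch and start a new transaction.
            If count Mod BatchSize = 0 Then
                tx.Commit()
                tx = cnxn.BeginTransaction()
                cmd.Transaction = tx
            End If
        Next
        tx.Commit()
    End Sub
End Module
```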
Nov 1 '07 #4
Tom,

I am not sure if this really speeds up the action; it probably only speeds
up the writing of the code. AFAIK "Using" is a kind of generic construct
that does a little more than the code you would write yourself: in this
case the dispose of the connection, which removes the connection string
from the connection before closing it. (In version 1.1 the Dispose was
otherwise just an overload of Close; for later versions I have never
investigated this.)

Cor

Nov 1 '07 #5
Tom,

I took your intention wrongly; the loop is of course the right way.

Cor
Nov 1 '07 #6
ags,

The Jet database is in fact not built for what you are doing. It was popular
because it was free, and the other free database from Microsoft, MSDE,
was terrible to use.

However, now there is SQL Express; if you use that, it will really speed up
your process.

And then take the sample from Tom. What you can do as well is create a
sproc (stored procedure). It has no time benefit for seldom-run processes,
but SQL Server has a built-in kind of caching where it keeps the most
recently used sprocs in precompiled form (this is not done for all sprocs,
before you take this wrongly).

Cor

Nov 1 '07 #7
"ags5406" <my**********@gmail.comschrieb
Everything looks and functions great, my only real dissatisfaction
with my application is the update time, which in my last test took
about 45-46 minutes for 9800 records.
If you need it really fast, use VB6+DAO. Use the AddNew/Update for Recordset
objects and you will be /multiple/ times faster than using INSERT for each
row. Unfortunatelly, in VB.Net we are forced to use ADO.net which is really
slow and a step back in this area. I wouldn't recommend DAO Interop due to
the COM Interop and programming overhead.
Armin

Nov 1 '07 #8
Thanks everyone for the suggestions. I am moving forward with some
success.

So I originally had this code...

myCnxn = New OleDbConnection(sDBCnxnString)
cmUsrSQL = New OleDbCommand(sSQL, myCnxn)
myCnxn.Open()
iDummy = cmUsrSQL.ExecuteNonQuery
myCnxn.Close()

... in Sub execSQLNonQuery(sSQL As String)

The purpose was to be able to build a string anywhere in the code, for
whatever purpose, and send the string to this one subroutine to execute
the DB part. I kept it mostly the same, with the key difference being
that I moved the open and close out of this subroutine. Now
the .Open() occurs in Main() and the .Close() occurs when the
program exits. The connection to the DB is always open. I ran the
same test from yesterday (almost the same number of records today) and
saw a huge reduction in the update time - between 16 and 17 minutes
savings (~37%).
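A sketch of that arrangement: execSQLNonQuery and sDBCnxnString are the names from the post; the module layout and connection string are illustrative, not the actual code:

```vbnet
Imports System.Data.OleDb

Module AppMain
    ' Module-level connection: opened once at startup, closed at exit.
    Private myCnxn As OleDbConnection
    ' Illustrative connection string; the real one is not shown in the post.
    Private sDBCnxnString As String = _
        "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=mydb.mdb"

    Sub Main()
        myCnxn = New OleDbConnection(sDBCnxnString)
        myCnxn.Open()
        Try
            ' ... download, parse, and insert ...
        Finally
            myCnxn.Close()
        End Try
    End Sub

    ' Every caller reuses the already-open connection.
    Sub execSQLNonQuery(ByVal sSQL As String)
        Using cmd As New OleDbCommand(sSQL, myCnxn)
            cmd.ExecuteNonQuery()
        End Using
    End Sub
End Module
```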

So that's good. But I still want to speed it up even more.

Now the downloading from the secure site is taking about 12% of the
time, parsing is taking 6%, and updating the local db is taking 82% of
the time. My biggest opportunity for optimization still appears to be
in the updating of the local database.

I still need to figure out how to add multiple records at a time
rather than using INSERT INTO over and over for each record.

Robert - you may have been leading me in the right direction. You
mention getting 15000 rows per second back in the day? Just FYI my
target table has 20 fields, 14 of which are text (of varying max
lengths) and 6 are datetime. I think I understand sending the create
table commands. But then wouldn't I just still be adding rows one at
a time to my new table? Or can I create the table with the data all
at once?

Nov 1 '07 #9
On Nov 1, 9:18 am, ags5406 <my.spam.5...@gmail.com> wrote:
>Robert - you may have been leading me in the right direction. You
>mention getting 15000 rows per second back in the day? [snip]
>I think I understand sending the create
>table commands. But then wouldn't I just still be adding rows one at
>a time to my new table? Or can I create the table with the data all
>at once?
I think what Robert is suggesting is to create a mirror of your table
on the local machine. Then you do your single inserts into that
table. Once you've inserted all of your data into the local database,
you then do a SELECT INTO from the local table to the real database
table...

Does that make sense?
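In Jet SQL the final append can then be a single statement, since the IN clause can address another .mdb file directly. A sketch, where localCnxn is an open connection to the local database and the path and table names are made up for illustration:

```vbnet
' One statement appends the whole local staging table to the remote table.
' 'C:\remote\master.mdb', FinalTable and TempImport are illustrative names.
Dim sSQL As String = _
    "INSERT INTO FinalTable IN 'C:\remote\master.mdb' " & _
    "SELECT * FROM TempImport"
Using cmd As New OleDbCommand(sSQL, localCnxn)
    cmd.ExecuteNonQuery()
End Using
```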

--
Tom Shelton

Nov 1 '07 #10
On Nov 1, 10:52 am, Tom Shelton <tom_shel...@comcast.net> wrote:
>I think what Robert is suggesting is to create a mirror of your table
>on the local machine. Then you do your single inserts into that
>table. Once you've inserted all of your data into the local database,
>you then do a SELECT INTO from the local table to the real database
>table...
>
>Does that make sense?
Actually, yes, that does make sense. When I started the project I was
keeping the DB on my machine. I've since moved it to a remote
server. I didn't quite understand what he was saying because my brain
had forgotten I was storing it remotely. Thanks.

Nov 1 '07 #11

This thread has been closed and replies have been disabled. Please start a new discussion.

