
When adding lots of records, do you still use INSERT INTO?

My goal is to get data from an XML file into a couple of tables in an Access
database. The XML file is a little complex, so I need control over what I do
(I can't just read it into a dataset).

The way I have it now is enumerating through my XML file and, for each
record, running an INSERT INTO SQL statement to put the values in the
database.

Is this going to be okay with 50 or so records (hitting the database with a
separate INSERT command for each record), or should I use one of the other
methods? That is:

1) continue using an INSERT INTO statement for each record, or
2) use an ADO recordset, or
3) make a datatable from the XML (it is kind of complex though) and
then sync it up with the database.

Thanks for the help!
Mike
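For reference, the per-record approach described above can be sketched like this. The sketch uses Python's sqlite3 and xml.etree in place of ADO.NET and Access purely to illustrate the pattern; the `orders` table, its columns, and the sample XML are all invented for the example:

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical sample data standing in for the real XML file.
XML = """<orders>
  <order id="1"><total>9.50</total></order>
  <order id="2"><total>12.00</total></order>
</orders>"""

def load_orders(conn, xml_text):
    """Enumerate the XML and run one parameterized INSERT per record."""
    root = ET.fromstring(xml_text)
    n = 0
    for order in root.findall("order"):
        # One round trip to the database for each record.
        conn.execute(
            "INSERT INTO orders (id, total) VALUES (?, ?)",
            (int(order.get("id")), float(order.findtext("total"))),
        )
        n += 1
    conn.commit()
    return n

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, total REAL)")
loaded = load_orders(conn, XML)
```

For 50 or so records this is perfectly workable; the per-statement overhead only starts to matter at much larger volumes or over a slow network link.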
Nov 17 '05 #1
4 Replies


I'd say the INSERT INTO statements are a worthy choice.
You might choose to batch them together to reduce network traffic and
improve performance.

--
I hope this helps,
Steve C. Orr, MCSD, MVP
http://Steve.Orr.net
Hire top-notch developers at http://www.able-consulting.com


Nov 17 '05 #2

Thanks Steve,
After reading a bit, I realized that the best strategy was to take the
table(s) from the database and load them into a dataset (but to keep the
dataset small, I used SELECT * FROM MyTable WHERE 1=2 so as to fetch no
records, just the schema). I then used a command builder to generate the
InsertCommand. All I had to do was load the dataset's tables with the info,
and when everything was ready, make one Update call to push it all to the
database. Pretty neat for .NET (as opposed to the old recordsets in ADO)!
Mike
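The pattern here — stage all the rows in memory, then flush them in one call — can be approximated outside .NET as well. A minimal sketch using Python's sqlite3 (the `items` table is invented; `executemany` inside a transaction plays the role of the single DataAdapter Update):

```python
import sqlite3

def flush_staged_rows(conn, staged):
    """Insert all staged rows inside a single transaction, analogous to
    filling a DataSet first and then calling Update once at the end."""
    with conn:  # commits on success, rolls back if any insert fails
        conn.executemany(
            "INSERT INTO items (name, value) VALUES (?, ?)", staged
        )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT, value INTEGER)")
staged = [("x", 10), ("y", 20)]  # rows accumulated in memory first
flush_staged_rows(conn, staged)
total = conn.execute("SELECT SUM(value) FROM items").fetchone()[0]
```

One nice property of deferring the writes like this is atomicity: either every staged row lands or none do.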

Nov 17 '05 #3

Just a side technical note, since you have already solved the problem
correctly: the only way to do multiple record inserts is to insert each
record individually. When you use a DataSet to do multiple inserts, it still
does them individually, but automatically. That makes it, as you've noticed,
an excellent tool for the job.
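In other words, a "single update" call is a convenience, not a different wire protocol. A tiny sketch of what such an API amounts to under the hood, again in Python/sqlite3 with an invented `items` table (the helper name `adapter_update` is made up for illustration):

```python
import sqlite3

def adapter_update(conn, insert_sql, pending_rows):
    """What a one-shot 'Update' amounts to internally: the library loops
    over the pending rows and issues one INSERT per row on your behalf."""
    count = 0
    for row in pending_rows:
        conn.execute(insert_sql, row)  # still one statement per record
        count += 1
    conn.commit()
    return count

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT, value INTEGER)")
done = adapter_update(
    conn,
    "INSERT INTO items (name, value) VALUES (?, ?)",
    [("a", 1), ("b", 2), ("c", 3)],
)
```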

--
HTH,

Kevin Spencer
Microsoft MVP
.Net Developer
http://www.takempis.com
Big Things are made up of
Lots of Little Things.


Nov 17 '05 #4

I have nothing against this dataset method. It certainly does make things
simple.
But in many cases you can get better performance by batching multiple insert
statements together into one long command string. That way only one command
is sent across your network, as opposed to N individual statements.
I guess it depends on your priorities: simplicity vs. performance.
Of course, both methods get significantly more complex when multi-user
concurrency issues are involved.
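A rough sketch of this batching idea, once more in Python/sqlite3 rather than ADO.NET (the `items` table and helper name are invented). Two caveats worth hedging: not every provider accepts multiple statements in one command (the Jet/Access OLE DB provider, as far as I know, does not, so this applies more to servers like SQL Server), and building SQL from strings trades away the injection safety of parameters, so it is only reasonable for trusted, literal data:

```python
import sqlite3

def batched_insert_sql(rows):
    """Concatenate many INSERT statements into one command string.
    Only safe for trusted literal data; prefer parameters otherwise."""
    stmts = [
        "INSERT INTO items (name, value) VALUES ('%s', %d);" % (name, value)
        for name, value in rows
    ]
    return "\n".join(stmts)

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (name TEXT, value INTEGER)")
script = batched_insert_sql([("a", 1), ("b", 2)])
conn.executescript(script)  # one call carries the whole batch
n = conn.execute("SELECT COUNT(*) FROM items").fetchone()[0]
```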

--
I hope this helps,
Steve C. Orr, MCSD, MVP
http://Steve.Orr.net
Hire top-notch developers at http://www.able-consulting.com

Nov 17 '05 #5
