
Getting data from XML file to an array/db

Hi All

Wondered if you could help.

I have created a little backup routine (using classic ASP and ADO) to get my
data from my db into a bespoke XML file, i.e. I query the data into a
recordset, concatenate the XML tags with this data and then write it to a
text file using the FileSystemObject.
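The backup step described above might look something like this (a minimal sketch; the table, column, and connection-string names are placeholders, not the poster's actual schema):

```vbscript
' Sketch of the bespoke backup: query a recordset, concatenate
' XML tags around the data, write the result with FileSystemObject.
Dim conn, rs, fso, ts, xml
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open Application("ConnString")   ' assumed connection string

Set rs = conn.Execute("SELECT id, name FROM MyTable")  ' placeholder table

xml = "<?xml version=""1.0""?>" & vbCrLf & "<rows>" & vbCrLf
Do While Not rs.EOF
    xml = xml & "  <row>" & _
          "<id>" & rs("id") & "</id>" & _
          "<name>" & Server.HTMLEncode(rs("name") & "") & "</name>" & _
          "</row>" & vbCrLf
    rs.MoveNext
Loop
xml = xml & "</rows>"

Set fso = Server.CreateObject("Scripting.FileSystemObject")
Set ts = fso.CreateTextFile(Server.MapPath("backup.xml"), True)
ts.Write xml
ts.Close
```

Note the `Server.HTMLEncode` call: hand-built XML needs special characters (`&`, `<`, quotes) escaped, or the file won't parse back in.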

This backup proc all works fine, but the problem comes when I go to restore
the backup. I'm using the XML DOM object to get the dataset by tag name and
then building up the 'INSERT INTO' statements by concatenating them with the
XML-based data, but my ASP script always times out before this process
completes on any dataset with a large amount of data in it.

PLEASE NOTE: my scripting timeout is set to 15 seconds by my host, and I
cannot change this or switch ISP at the current time.

The backup/restore process all falls into place really nicely, apart from
the fact that I can't get the XML restore step to work fast enough to get the
data back into the db. For example, I had an XML file consisting of 6000+
rows, with 18 cols/fields per row. I did a simple test of extracting this
data using two nested FOR...NEXT loops and writing the field/tag values to
the page, and the ASP couldn't complete this before the timeout. It's as if
the XML object/process is too slow to handle this amount of data. Is this
the case?
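One common way round a hard server-side timeout is to restore in chunks, with each request doing a bounded slice of the work and redirecting to itself for the next slice. The sketch below assumes the placeholder schema and file name used above; `restore.asp`, the chunk size, and the column names are all illustrative, not the poster's actual code:

```vbscript
' Chunked restore: each request inserts at most CHUNK rows, then
' redirects to itself with the next start position, so no single
' request runs anywhere near the 15-second limit.
Const CHUNK = 500
Dim conn, doc, rows, i, start, sql
start = CLng("0" & Request.QueryString("start"))

Set conn = Server.CreateObject("ADODB.Connection")
conn.Open Application("ConnString")   ' assumed connection string

Set doc = Server.CreateObject("MSXML2.DOMDocument.6.0")
doc.async = False
doc.Load Server.MapPath("backup.xml")

Set rows = doc.getElementsByTagName("row")
For i = start To rows.length - 1
    If i >= start + CHUNK Then Exit For
    sql = "INSERT INTO MyTable (id, name) VALUES (" & _
          rows.item(i).selectSingleNode("id").text & ", '" & _
          Replace(rows.item(i).selectSingleNode("name").text, "'", "''") & "')"
    conn.Execute sql
Next

If i < rows.length Then
    ' More rows remain: hand off to the next request.
    Response.Redirect "restore.asp?start=" & i
End If
```

Reloading and re-parsing the file on every request is wasteful, but it keeps each individual request well under the timeout, which is the constraint here.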

When I've used ADO's recordset Save option with binary format it works like
lightning, and its XML offering isn't much slower either, but I don't like
the way the binary/XML file is put together because I can't add the queries
to the same file; i.e. using the ADO binary/XML method I end up with about
22 separate files, whereas my bespoke XML file is just one file with all the
data in it.

Other users are going to be backing up my db via an ASP page, so they only
back up the data I want them to back up (hence the reason I don't just use a
direct SQL backup program), but I seem to be resigned to the fact that I
will only be able to use the ADO version instead of my own.

Anybody had this problem and got round it?

Thanks
Apr 16 '07 #1
1 Reply


Newbie wrote:
Hi All

Wondered if you could help.

I have created a little backup routine (using classic ASP and ADO) to
get my data from my db into a bespoke XML file, i.e. I query the data
into a recordset, concatenate the XML tags with this data and then
write it to a text file using the FileSystemObject.
You've gone to a lot of unnecessary trouble. You can use the recordset Save
method to save a recordset to an XML file. Later on, you can use the
recordset Open method to open a recordset on that XML file.
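The Save/Open round trip the reply describes looks roughly like this (a sketch; `adPersistXML` is the real ADO constant, but the query, file name, and connection string are placeholders):

```vbscript
' Persist a recordset as XML, then reopen it straight from the file.
Const adPersistXML = 1
Dim conn, rs
Set conn = Server.CreateObject("ADODB.Connection")
conn.Open Application("ConnString")   ' assumed connection string

Set rs = Server.CreateObject("ADODB.Recordset")
rs.Open "SELECT * FROM MyTable", conn
rs.Save Server.MapPath("backup.xml"), adPersistXML
rs.Close

' Later, on restore, open a recordset directly on the saved file
' via the persistence provider:
rs.Open Server.MapPath("backup.xml"), "Provider=MSPersist;"
```

This is why the ADO route is so much faster than hand-parsing: ADO streams its own persisted format back into a recordset without walking a DOM tree.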
Oh! Reading on, I now see you know about Save ... Open. I would say that you
do not need the FileSystemObject to save the XML to a file. Use an XML
document object to create your XML (see the data island example here for an
example: http://www.davidpenton.com/testsite/tips/). Use the Save method to
save the XML document to a file.
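Building the document with MSXML and saving it directly, instead of string concatenation plus FileSystemObject, might look like this (a sketch with placeholder element names):

```vbscript
' Build the XML with DOM calls and save it; the parser handles
' escaping of special characters for you.
Dim doc, root, row, fld
Set doc = Server.CreateObject("MSXML2.DOMDocument.6.0")
Set root = doc.createElement("rows")
doc.appendChild root

Set row = doc.createElement("row")
Set fld = doc.createElement("name")
fld.text = "O'Reilly & Co"    ' & and ' are escaped automatically
row.appendChild fld
root.appendChild row

doc.save Server.MapPath("backup.xml")
```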
This backup proc all works fine, but the problem comes when I go to
restore the backup. I'm using the XML DOM object to get the dataset
by tag name and then building up the 'INSERT INTO' statements by
concatenating them with the XML-based data, but my ASP script always
times out before this process completes on any dataset with a large
amount of data in it.

PLEASE NOTE: my scripting timeout is set to 15 seconds by my host, and
I cannot change this or switch ISP at the current time.

The backup/restore process all falls into place really nicely, apart
from the fact that I can't get the XML restore step to work fast
enough to get the data back into the db. For example, I had an XML
file consisting of 6000+ rows, with 18 cols/fields per row. I did a
simple test of extracting this data using two nested FOR...NEXT loops
and writing the field/tag values to the page, and the ASP couldn't
complete this before the timeout. It's as if the XML object/process
is too slow to handle this amount of data. Is this the case?
Yes, it probably is. Nobody ever said parsing an XML file would be fast.
When I've used ADO's recordset Save option with binary format it works
like lightning, and its XML offering isn't much slower either, but I
don't like the way the binary/XML file is put together because I can't
add the queries to the same file; i.e. using the ADO binary/XML method
I end up with about 22 separate files, whereas my bespoke XML file is
just one file with all the data in it.

Other users are going to be backing up my db via an ASP page, so they
only back up the data I want them to back up (hence the reason I
don't just use a direct SQL backup program), but I seem to be
resigned to the fact that I will only be able to use the ADO version
instead of my own.

Anybody had this problem and got round it?
Nope. It would never have occurred to me to handle 6000+ rows in an XML
document, let alone in an ASP page. This is the wrong technology for the
job, IMO. Don't forget that XML is extremely bulky, so not only are you
fighting CPU usage, you are also consuming memory, much more than your host
would like, I'm sure. There's a very good reason your host has the timeout
set to 15 seconds.

--
Microsoft MVP - ASP/ASP.NET
Please reply to the newsgroup. This email account is my spam trap so I
don't check it very often. If you must reply off-line, then remove the
"NO SPAM"
Apr 16 '07 #2

This discussion thread is closed
