Bytes | Software Development & Data Engineering Community

Getting data from XML file to an array/db

Hi All

Wondered if you could help.

I have created a little backup routine (using classic ASP and ADO) to get my
data from my db into a bespoke XML file, i.e. I query the data into a
recordset, concatenate the XML tags with this data and then write it to a
text file using the FileSystemObject.
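The backup step described above might look something like this, a minimal sketch in classic ASP/VBScript. The connection string `sConn`, table name `tblData` and output path `sPath` are hypothetical names, not from the original post:

```vbscript
' Sketch of the bespoke backup: query into a recordset, concatenate
' XML tags around each field value, write the result with the
' FileSystemObject. All names (sConn, tblData, sPath) are assumptions.
Dim cn, rs, fso, ts, sXml, fld
Set cn = Server.CreateObject("ADODB.Connection")
cn.Open sConn
Set rs = cn.Execute("SELECT * FROM tblData")

sXml = "<backup>"
Do While Not rs.EOF
    sXml = sXml & "<row>"
    For Each fld In rs.Fields
        ' Coerce Null to "" and escape markup characters in the value
        sXml = sXml & "<" & fld.Name & ">" & _
               Server.HTMLEncode(fld.Value & "") & "</" & fld.Name & ">"
    Next
    sXml = sXml & "</row>"
    rs.MoveNext
Loop
sXml = sXml & "</backup>"

Set fso = Server.CreateObject("Scripting.FileSystemObject")
Set ts = fso.CreateTextFile(sPath, True)
ts.Write sXml
ts.Close
```

Note this assumes the field names are valid XML element names (no spaces or special characters).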

This backup proc all works fine, but the problem comes when I go to restore
this backup. I'm using the XML DOM object to get the dataset by tag name and
then building up the 'INSERT INTO' statements by concatenating them with the
XML-based data, but my ASP script always times out before this process
completes on any dataset with a large amount of data in it.
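The restore step being described is roughly the following, sketched under the same hypothetical names (`cn` is an open ADODB.Connection, `tblData` and `sPath` are assumed). The per-row `cn.Execute` is exactly where the cost piles up on a large file:

```vbscript
' Sketch of the DOM-based restore: load the bespoke file, walk the
' <row> elements, and rebuild one INSERT per row. Names are assumptions.
Dim doc, rows, row, col, i, sSql
Set doc = Server.CreateObject("MSXML2.DOMDocument")
doc.async = False
doc.Load sPath

Set rows = doc.getElementsByTagName("row")
For i = 0 To rows.length - 1
    Set row = rows.Item(i)
    sSql = "INSERT INTO tblData VALUES ("
    For Each col In row.childNodes
        ' Double up single quotes so the value is safe inside the literal
        sSql = sSql & "'" & Replace(col.text, "'", "''") & "',"
    Next
    sSql = Left(sSql, Len(sSql) - 1) & ")"
    cn.Execute sSql   ' one round trip per row: slow for 6000+ rows
Next
```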

PLEASE NOTE: my scripting timeout is set to 15 seconds by my host and I
cannot change this, or my ISP, at the current time.

The backup/restore process all falls into place really nicely, apart from
the fact that I can't get the XML restore bit to work fast enough to get the
data back into the db. For example, I had an XML file consisting of 6000+
rows with 18 cols/fields per row. I did a simple test of extracting this
data using two nested FOR...NEXT loops and just writing the field/tag values
to the page, and the ASP couldn't complete even this before the timeout.
It's as if the XML object/process is too slow to handle this amount of data.
Is this the case?

When I've used ADO's recordset Save option with the binary format it works
like lightning, and its XML offering isn't too much slower either, but I
don't like the way the binary/XML file is put together because I can't add
the queries to the same file: using the ADO binary/XML method I end up with
about 22 separate files, whereas my bespoke XML file is just one file with
all the data in it.

Other users are going to be backing up my db via an ASP page, so that they
only back up the data I want them to back up (hence why I don't just use a
direct SQL backup program), but I seem to be resigned to the fact that I
will only be able to use the ADO version instead of my own.

Anybody had this problem and got round it?

Thanks
Apr 16 '07 #1
Newbie wrote:
Hi All

Wondered if you could help.

I have created a little backup routine (using classic ASP and ADO) to
get my data from my db into a bespoke XML file, ie I query the data
into a recordset, concat the XML tags with this data and then put it
into a text file using FileSysObj.
You've gone to a lot of unnecessary trouble. You can use the recordset Save
method to save a recordset to an XML file. Later on, you can use the
recordset Open method to open a recordset on that XML file.
Oh! Reading on, I now see you know about Save ... Open. I would say that you
do not need FileSysObj to save the XML to a file. Use an XML document object
to create your XML (see the data island example at
http://www.davidpenton.com/testsite/tips/), then use its Save method to save
the XML document to a file.
This backup proc all works fine, but the problem is when I go to
restore this backup. I'm using the xml dom object to get the dataset
by tagname and then building up the 'insert into' statements by
concating it with the xml-based data, but my ASP scripting is always
timing out by the time this process is complete on any dataset with a
large amount of data in it.

PLEASE NOTE: my scripting timeout is set to 15 secs by my host and I
cannot change this or my ISP at this current time.

The backup/restore process all falls into place really nicely, apart
from the fact that I can't get the XML restore bit to work fast
enough to get the data back into the db. For example, I had an xml
file consisting of 6000+ rows, which contained 18 cols/fields per
row. I did a simple test of extracting this data using 2 x FOR NEXT
loops and writing the field/tag values to the page and the ASP
couldn't complete this before the timeout. Its as if the XML
object/process is too slow to handle this amount of data. Is this the
case?
Yes, it probably is. Nobody ever said parsing an xml file would be fast.
When I've done the ADO's recordset save option to binary this works
like lightning and it's xml offering isn't too much slower as well,
but I don't like the way the binary/xml file is put together because
I can't add the queries to the same file, ie using the ADO binary/xml
method I have about 22 separate files, whereas my bespoke xml file is
just 1 file with all the data in.

Other users are going to be backing up my db via an ASP page, so they
only backup the data I want them to back up (hence the reason why I
don't just use a direct sql backup program), but I seem to be
resigned to the fact that I will only be able to use the ADO version
instead of my own.

Anybody had this problem and got round it?
Nope. It would never have occurred to me to handle 6000+ rows in an XML
document, let alone in an ASP page. This is the wrong technology for the
job, IMO. Don't forget that XML is extremely bulky, so not only are you
fighting CPU usage, you are also consuming memory, much more than your host
would like, I'm sure. There's a very good reason your host has the timeout
set to 15 seconds.

--
Microsoft MVP - ASP/ASP.NET
Please reply to the newsgroup. This email account is my spam trap so I
don't check it very often. If you must reply off-line, then remove the
"NO SPAM"
Apr 16 '07 #2


