I can help, but you'll have to give me a lot more details, specifically
what the end goal of this process is. If you do want to go the DataTable
route, I can tell you that the DataTable in .NET 1.0 and 1.1 most likely
will not be able to handle the data memory-wise. The 2.0 DataTable can
handle the data, but will still probably take a couple hundred MB of
memory from what I've seen. I'm not sure how long the DataTable will
take to load, but I could see it taking upwards of a minute or more.
If you want to read from a CSV file, you can try the parser I sell,
http://www.csvreader.com . I'm just not sure how many of these records
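That parser is a commercial product, so as a neutral illustration of the general idea (this sketch is mine, not the csvreader.com API), here's how streaming a large CSV row by row keeps memory flat no matter how many rows the file has:

```python
import csv

def stream_matching_rows(path, predicate):
    """Yield matching rows one at a time instead of loading the whole
    file, so memory use stays constant regardless of row count."""
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if predicate(row):
                yield row

# Hypothetical usage: pull only the rows you care about from a big file.
# for row in stream_matching_rows("export.csv", lambda r: r[1] == "42"):
#     process(row)
```

The point is that a generator never holds more than one row in memory, which is the property you'd want with 1.7 million rows.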
you're actually trying to deal with at any one time. If your issue is
simply that you have 1.7 million rows of data in the db, and queries
against it for portions of the data, say a couple hundred rows, aren't
coming back quickly, then your issue is most likely in the db structure.
In that case I'd be looking at the indexing on the table versus your
query.
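Not from the thread, but a minimal SQLite sketch (the table and column names are invented) showing what that indexing check looks like: an index on the filtered column turns a full-table scan into an index search, which is usually the difference between slow and fast on a table this size.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(i, i % 1000, i * 0.5) for i in range(100_000)])

query = "SELECT * FROM orders WHERE customer_id = 42"

# Without an index, the plan is a full scan of every row.
plan_before = con.execute("EXPLAIN QUERY PLAN " + query).fetchone()

# Index the column the WHERE clause filters on.
con.execute("CREATE INDEX ix_orders_customer ON orders (customer_id)")

# With the index in place, the plan becomes an index search.
plan_after = con.execute("EXPLAIN QUERY PLAN " + query).fetchone()

print(plan_before[-1])  # typically reports a SCAN
print(plan_after[-1])   # typically reports a SEARCH using the index
```

The same principle applies to any relational database: compare the query's WHERE/JOIN columns against the table's indexes before blaming the engine.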
Bruce Dunwiddie
Je***************@gmail.com wrote:
Peter,
I was having trouble getting the database to select the records quickly
(1.7 million), so I thought that a CSV file (essentially the query
results) that was just read straight from disk (no query overhead)
might be more efficient.
Thanks for your comments
Jerry