Bytes | Software Development & Data Engineering Community
working with large files


When you have to read a big file (5-30 MB) and load the data into the database (with some processing logic in between; the details don't matter), which of the ADO.NET approaches is recommended?

1. Read line by line and execute an INSERT for each line.
2. Accumulate the rows in a DataSet and use the adapter's Update method, i.e.:

    for each line
        ds.Tables(0).Rows.Add(...)
    next
    adapter.Update(ds.GetChanges(DataRowState.Added))

Which one should we prefer in these situations, and will it make any difference in performance? Are there any limitations on how many rows can be inserted using the Update method?

Thanks much
Chris
Sep 19 '05 #1
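The trade-off between the two approaches is not ADO-specific, so it can be sketched in any database API. A minimal illustration using Python's sqlite3 module (the table name, schema, and sample lines are made up for the example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rows (line TEXT)")

lines = ["alpha", "beta", "gamma"]  # stand-in for lines read from the file

# Approach 1: one execute (and, against a real server, one round trip) per line.
for line in lines:
    conn.execute("INSERT INTO rows (line) VALUES (?)", (line,))

# Approach 2: accumulate the rows and hand them to the driver in one batch.
conn.executemany("INSERT INTO rows (line) VALUES (?)",
                 [(line,) for line in lines])

count = conn.execute("SELECT COUNT(*) FROM rows").fetchone()[0]
print(count)  # 6: three rows from each approach
```

Whether the batched form is actually faster depends on the driver; some batch APIs (like the DataAdapter's Update discussed below in this thread) still issue one statement per row under the hood.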
You can also concatenate all the statements you need into one large statement, separating them with semicolons, and execute that once.

The Update method of the data adapter does the updates one by one anyway, so I don't imagine it would be any faster than the other methods. It would still require a round trip to the database for every row, and carry a lot of processing overhead.

You should experiment to see what works best for your particular situation.
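The semicolon-batching idea above can be sketched the same way; sqlite3's executescript runs a multi-statement string in a single call. Note that real code should use bound parameters rather than string formatting; literals appear here only to keep the batching idea visible, and per-batch statement limits vary by database:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rows (line TEXT)")

# Build one large statement, with each INSERT separated by a semicolon.
lines = ["alpha", "beta", "gamma"]
batch = ";".join(f"INSERT INTO rows (line) VALUES ('{l}')" for l in lines)
conn.executescript(batch)  # executed in one call

print(conn.execute("SELECT COUNT(*) FROM rows").fetchone()[0])  # 3
```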

Sep 19 '05 #2
You will achieve far better performance if you read the file and process as
you go. Make sure to use objects that are "memory frugal" since you are
likely to do a lot of garbage collecting.

A dataset with 20MB of data in it is not a good use of datasets.

--
--- Nick Malik [Microsoft]
MCSD, CFPS, Certified Scrummaster
http://blogs.msdn.com/nickmalik

Disclaimer: Opinions expressed in this forum are my own, and not
representative of my employer.
I do not answer questions on behalf of my employer. I'm just a
programmer helping programmers.
--
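Reading and processing as you go, rather than building a 20 MB in-memory structure, can be sketched like this (the batch size is arbitrary, and io.StringIO stands in for an open file; a real file iterates the same way, one line at a time, so memory stays bounded):

```python
import io
import sqlite3

BATCH_SIZE = 1000  # arbitrary; tune for your workload

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE rows (line TEXT)")

source = io.StringIO("alpha\nbeta\ngamma\n")  # stand-in for open("bigfile.txt")

batch = []
for line in source:
    batch.append((line.rstrip("\n"),))
    if len(batch) >= BATCH_SIZE:          # flush a full batch, then forget it
        conn.executemany("INSERT INTO rows (line) VALUES (?)", batch)
        conn.commit()
        batch.clear()
if batch:                                 # flush the remainder
    conn.executemany("INSERT INTO rows (line) VALUES (?)", batch)
    conn.commit()

print(conn.execute("SELECT COUNT(*) FROM rows").fetchone()[0])  # 3
```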
Sep 21 '05 #3
Thanks guys for the feedback.

Thank you
Chris

Sep 24 '05 #4
