If at all possible, I would read more than 202 characters at a time.
I'm going to guess that each record you want to read from your file is
202 characters long. If we assume a 1 gigabyte file of those records
(1,073,741,824 characters/bytes), you have roughly 5,315,553 such
records. Reading a single record at a time requires 5.3 million separate
accesses to the disk.
On the other hand, if you increase the size of each read and then parse the
records out of that data, you save yourself a huge amount of work... for
example, let's say you read 10 records in at a time... you bring the required
number of separate disk accesses down to roughly 530 thousand, much better
than 5.3 million. Increase the read size by another 10-fold and you drop
your disk accesses down to roughly 53 thousand.
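Something along these lines is what I have in mind, just as a rough sketch
(RecordSize, RecordsPerRead and HandleRecord are names I made up for
illustration, and 100 records per read is just an arbitrary starting point):

    using System.IO;
    using System.Text;

    class ChunkedReaderSketch
    {
        const int RecordSize = 202;       // assumed fixed record length
        const int RecordsPerRead = 100;   // tune this while testing

        static void ReadRecords(string fileName)
        {
            char[] buffer = new char[RecordSize * RecordsPerRead];

            using (Stream stream = File.OpenRead(fileName))
            using (StreamReader reader = new StreamReader(stream, Encoding.ASCII))
            {
                int read;
                // ReadBlock keeps reading until the buffer is full or the
                // file ends, so a record never gets split across two calls.
                while ((read = reader.ReadBlock(buffer, 0, buffer.Length)) > 0)
                {
                    // walk the big buffer one 202-character record at a time
                    for (int offset = 0; offset + RecordSize <= read; offset += RecordSize)
                    {
                        string record = new string(buffer, offset, RecordSize);
                        HandleRecord(record);   // hypothetical per-record processing
                    }
                }
            }
        }

        static void HandleRecord(string record)
        {
            // placeholder: whatever you actually do with each record goes here
        }
    }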
Depending on the amount of memory available, feel free to play around with
adjusting the amount of data you read each time. Granted, while a larger
read reduces the number of times you have to hit the disk... it also
increases the memory requirements of your application and the possibility
of other slowdowns. Keep testing and tweaking it until you get it right, or
at least as fast as is acceptable.
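One easy knob to experiment with is the internal buffer StreamReader itself
uses; you can pass a size to its constructor. A quick sketch (the 64 KB value
is just a guess to start testing from, assuming the same fileName as your
snippet below):

    using (Stream stream = System.IO.File.OpenRead(fileName))
    using (StreamReader reader = new StreamReader(stream,
        System.Text.Encoding.ASCII, false, 64 * 1024))
    {
        // same read loop as before; StreamReader now fills a 64 KB internal
        // buffer from the file instead of its much smaller default
    }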
Just remember, disk access is one of the slowest forms of I/O you can do on
a computer.
Brendan
"ha*******@gmail.com" wrote:
I followed your instruction but the process is so slow now.
using (Stream stream = System.IO.File.OpenRead(fileName))
{
    using (StreamReader streamReader = new StreamReader(stream, System.Text.Encoding.ASCII))
    {
        char[] buffer = new char[202];
        int read = 0;
        while (streamReader.Peek() > -1)
        {
            read = streamReader.Read(buffer, 0, 202);
        }
    }
}