Hi,
I have to process a very "wide" CSV file. Basically, the file does not
display correctly in Notepad, WordPad, etc. because each line is 414
characters wide. Ordinarily, I would read the file with a StreamReader and
process each line in turn, e.g.
using (StreamReader objSR = new StreamReader(strFileSpec))
{
    string strLineIn;
    while ((strLineIn = objSR.ReadLine()) != null)
    {
        // process the line
    }
}
However, in this case, ReadLine() does not return the full 414 characters
that make up each line.
What are my options here? Should I use ReadBlock()? Or is a StreamReader not
even the correct object to do this sort of thing?
FYI, the file is downloaded from a legacy VMS mainframe via FTP, and it
appears that this process is inserting arbitrary carriage returns because of
the line length. Excel also cannot import the file correctly because of
this, although SQL Server's DTS can...
Also, if I manually reformat the file so that all the columns "line up", as
it were, I can process it normally with ReadLine(). Maybe the answer is to
remove all of the carriage returns first, insert one every 414 characters,
and then process the file as normal?
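For what it's worth, this is roughly what I had in mind, completely untested
and assuming the records really are exactly 414 characters and the only
noise is the stray CR/LF pairs the transfer inserted (the file name is made
up for illustration):

```csharp
// Sketch of the "strip and re-chunk" idea, not a tested implementation.
using System;
using System.IO;

class Rechunk
{
    const int RecordLength = 414; // fixed record width from the VMS file

    static void Main()
    {
        string strFileSpec = "download.dat"; // illustrative path

        // Slurp the whole file; these files should be small enough.
        string raw;
        using (StreamReader objSR = new StreamReader(strFileSpec))
        {
            raw = objSR.ReadToEnd();
        }

        // Throw away every carriage return / line feed, wherever it landed.
        string flat = raw.Replace("\r", "").Replace("\n", "");

        // Walk the flattened text in fixed 414-character slices.
        for (int pos = 0; pos + RecordLength <= flat.Length; pos += RecordLength)
        {
            string strLineIn = flat.Substring(pos, RecordLength);
            // process the record as before
        }
    }
}
```

If that is sound, I could skip ReadLine() entirely rather than trying to
repair the line breaks in place, but I'd welcome a better approach.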
Any assistance gratefully received.
Mark