<ff*****@gmail.com> wrote in message
news:11********************@h48g2000cwc.googlegroups.com...
Hi,
I would like to find out the fastest way to concatenate large (200 or
300MB) and numerous (500 - 700) Postscript files into a single one
(which can end up being several gigabytes) using C# within a Windows
Forms application.
One requirement is also to make sure the output file is closed only
when the concatenation is done, so that the next application using the
file (the printer spooler) does not assume it is finished unless it
actually is.
Well, you don't really need the fastest way. You need a reasonably fast
way. Just use System.IO.FileStream: open one for your output file, and
iterate over your input files. For each input file, read a buffer from the
input file, write the buffer to the output file, repeat.
You don't need to use huge buffers, and it's probably not worthwhile to use
async IO or double buffering or any of that.
Here's a simple file concatenation program:
David
using System;
using System.Collections.Generic;
using System.Text;
using System.IO;

namespace FileConcat
{
    class Program
    {
        static int Main(string[] args)
        {
            if (args.Length != 2)
            {
                Console.WriteLine("usage: FileConcat [SourceDir] [DestinationFile]");
                return 1;
            }
            try
            {
                Run(args[0], args[1]);
                return 0;
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex);
                return 1;
            }
        }

        static void Run(string SourceDir, string OutputFileName)
        {
            string[] inputFiles = Directory.GetFiles(SourceDir);
            int bufSize = 1024 * 64;
            byte[] buf = new byte[bufSize];
            // FileShare.None keeps the output file locked until the using
            // block disposes the stream, i.e. until concatenation is done.
            using (FileStream outFile =
                new FileStream(OutputFileName, FileMode.OpenOrCreate,
                    FileAccess.Write, FileShare.None, bufSize))
            {
                foreach (string inputFile in inputFiles)
                {
                    using (FileStream inFile =
                        new FileStream(inputFile, FileMode.Open, FileAccess.Read,
                            FileShare.Read, bufSize))
                    {
                        int br;
                        while ((br = inFile.Read(buf, 0, buf.Length)) > 0)
                        {
                            outFile.Write(buf, 0, br);
                        }
                    }
                }
            }
        }
    }
}
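As a side note (not from the original post): if you can assume .NET 4.0 or later, Stream.CopyTo performs the same buffered read/write loop internally, so the body of Run can be shortened. A minimal sketch under that assumption (class and method names are illustrative, not from the post):

```csharp
using System;
using System.IO;

class ConcatWithCopyTo
{
    public static void Concat(string sourceDir, string outputFileName)
    {
        // FileShare.None keeps the output locked until the using
        // block disposes the stream, same as in the listing above.
        using (FileStream outFile = new FileStream(outputFileName,
            FileMode.Create, FileAccess.Write, FileShare.None))
        {
            foreach (string inputFile in Directory.GetFiles(sourceDir))
            {
                using (FileStream inFile = File.OpenRead(inputFile))
                {
                    // CopyTo does the buffered copy (80 KB default buffer).
                    inFile.CopyTo(outFile);
                }
            }
        }
    }

    static void Main(string[] args)
    {
        Concat(args[0], args[1]);
    }
}
```

Note that Directory.GetFiles makes no ordering guarantee, so if the Postscript files must be appended in a particular order, sort the array first.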