On 23 Oct, 15:19, "Marc Gravell" <marc.grav...@gmail.com> wrote:
A lot of factors could be in play here. Trying to use enormous arrays
is a big contender, and accidentally holding on to large arrays
(directly or indirectly) will compound this.
Where possible, the most memory-efficient way to deal with large
binary data is a streamed approach - whether that means to/from a
file/database/network etc. depends on the scenario. But it means only
having to handle a small buffer of working data at a time.
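The streamed approach described here can be sketched roughly as below. This is a minimal illustration, not SSRS-specific code; the `CopyInChunks` helper and the 4096-byte buffer size are my own choices for the example:

```csharp
using System;
using System.IO;

public class StreamCopyDemo
{
    // Copy from source to destination through a small fixed buffer,
    // so memory use stays at bufferSize no matter how large the data is.
    public static void CopyInChunks(Stream source, Stream destination, int bufferSize)
    {
        byte[] buffer = new byte[bufferSize];
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, read);
        }
    }

    public static void Main()
    {
        byte[] data = new byte[100000]; // stand-in for large report output
        new Random(1).NextBytes(data);
        using (var src = new MemoryStream(data))
        using (var dst = new MemoryStream())
        {
            CopyInChunks(src, dst, 4096);
            Console.WriteLine(dst.Length == data.Length); // True
        }
    }
}
```

The point is that only `bufferSize` bytes of working data are ever in the buffer, regardless of the total size being copied.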
Can you give any more information on your setup? Size of byte[]? Any
chance you are keeping old instances alive? (perhaps subscribing to
long-lived events, such as static events). Would streaming be an
option, etc.
Marc
Marc, thanks for the prompt reply.
Here is a sample code snippet.
I am using SQL Server 2005 Reporting Services to get the data.
Byte[] results;
ReportExecutionService rsExec;
results = rsExec.Render(format, deviceInfo,
    out extension, out encode,
    out mimeType, out warnings, out streamIDs);
// Save the rendered output array as a PDF file.
SavePDF(results, filename);
private bool SavePDF(Byte[] results, string fileName)
{
    // The using block disposes the stream on exit, so explicit
    // Close/Dispose calls inside it are redundant.
    using (FileStream stream = File.OpenWrite(fileName))
    {
        stream.Write(results, 0, results.Length);
    }
    return true;
}
Now,
1. If I get a large stream back from ReportExecutionService, how do I
split it into chunks and create a PDF file? (Say the PDF has somewhere
around 3000-5000 pages in it.)
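For what it's worth, writing an already-rendered byte[] to disk in fixed-size chunks can be sketched as below. The `SaveInChunks` helper and the chunk size are illustrative, not part of the SSRS API, and note this does not reduce peak memory by itself, since `Render` has already returned the whole report as one array:

```csharp
using System;
using System.IO;

public class ChunkedSave
{
    // Hypothetical helper: write a byte[] to disk in fixed-size pieces
    // rather than a single Write call.
    public static void SaveInChunks(byte[] results, string fileName, int chunkSize)
    {
        using (FileStream stream = File.OpenWrite(fileName))
        {
            int offset = 0;
            while (offset < results.Length)
            {
                // Last chunk may be smaller than chunkSize.
                int count = Math.Min(chunkSize, results.Length - offset);
                stream.Write(results, offset, count);
                offset += count;
            }
        }
    }
}
```

To genuinely avoid holding the full report in memory, the server side would also have to hand the data over in pieces, which is the streaming question Marc raises above.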
Thanks in Advance.
Manoj