Peter S. wrote:
I am pulling some data from a source via ODBC and placing the information
in a DataSet. The first pull is very large, but once that is complete
I plan to do nightly pulls to pick up any new data added to the
(remote) table.
I can't seem to get past that initial (big) pull, as I get
OutOfMemory exceptions. Looking back at when this occurs, it seems to
happen upon stuffing the DataSet with either the 2,097,153rd or the
4,194,305th record.
My question is: is there a physical limitation to a DataSet, as these
numbers are fairly consistent? BTW, I have upgraded the server to 4GB
of memory and have two swap files of 4MB each. I suspect this would be
enough memory? Perhaps the swap files aren't in play when appending
data to a DataSet?
So the key question is whether I can do anything to allow the DataSet
to hold more data. I have probably done as much as I can to give as
much memory to the OS. (I am using v2.0.50727.42 of .NET)....
AFAIK, there's no specific limit to DataSet per se - you're likely hitting
the overall limit of the 2GB user address space available to 32-bit programs
under Windows.
A couple of things that should help:
1. Change the boot.ini of the server to include the /3GB switch and mark
your executable as Large Address Aware. This decreases the amount of memory
available to the kernel, so you might not want to leave it like this, but it
could get you past this first hurdle.
2. Move to a 64-bit machine.
3. Change your design to pull the data over in smaller chunks.
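For option 1, the changes look roughly like this on 32-bit Windows (the
executable name is hypothetical; editbin ships with Visual Studio):

```
REM boot.ini - append /3GB to the existing OS entry, e.g.:
multi(0)disk(0)rdisk(0)partition(1)\WINDOWS="Windows Server 2003" /fastdetect /3GB

REM Then mark the pulling executable as Large Address Aware:
editbin /LARGEADDRESSAWARE PullApp.exe
```

Without the LARGEADDRESSAWARE flag the process still caps out at 2GB even
with /3GB set, so both halves are needed.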
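For option 3, a minimal C# 2.0 sketch of keyset-based chunking - assuming a
source table "Orders" with a monotonically increasing key column "Id" and a
DSN named "RemoteSource" (all three names are hypothetical). Note that TOP
is SQL Server syntax; other back ends use LIMIT or ROWNUM instead:

```csharp
using System;
using System.Data;
using System.Data.Odbc;

class ChunkedPull
{
    const int ChunkSize = 100000;

    static void Main()
    {
        long lastId = 0;
        using (OdbcConnection conn = new OdbcConnection("DSN=RemoteSource"))
        {
            conn.Open();
            while (true)
            {
                // ODBC parameters are positional; the '?' placeholder binds
                // to the first parameter added below.
                OdbcDataAdapter adapter = new OdbcDataAdapter(
                    "SELECT TOP " + ChunkSize +
                    " Id, Payload FROM Orders WHERE Id > ? ORDER BY Id",
                    conn);
                adapter.SelectCommand.Parameters.AddWithValue("@lastId", lastId);

                DataSet chunk = new DataSet();
                int rows = adapter.Fill(chunk, "Orders");
                if (rows == 0)
                    break;   // no new records - done

                // Process/persist this chunk here, then let it go out of
                // scope so the GC can reclaim the memory before the next pull.
                DataTable t = chunk.Tables["Orders"];
                lastId = Convert.ToInt64(t.Rows[t.Rows.Count - 1]["Id"]);
            }
        }
    }
}
```

The same key-based WHERE clause also gives you the nightly incremental pull
for free: just persist lastId between runs.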
-cd