Populating DataTable - OutOfMemory Exception

Hi all

I'm populating a DataTable with around 1,500,000 rows, the table contains 4
columns, 3 string columns and a decimal column.

However, I keep getting OutOfMemory exceptions when my app starts to reach
around 700MB memory (this is a console app by the way).

So, question is, why does the DT eat so much memory and how can I avoid
these OutOfMemory exceptions? I "need" all of these rows available because
they will be referenced throughout the lifecycle of my app.

Thanks
Kev
Aug 2 '07 #1
Hello Mantorok,

A DataTable contains much more than just the values. It is made up of a lot of different classes that also remain in memory (DataColumn, DataRow, and so on). Why not use a simpler construct?

public class MyImportantThing
{
    private string _string1;
    private string _string2;
    private string _string3;
    private decimal _decimal1;

    public MyImportantThing(string s1, string s2, string s3, decimal d1)
    {
        _string1 = s1;
        _string2 = s2;
        _string3 = s3;
        _decimal1 = d1;
    }

    // Read-only properties
    public string String1 { get { return _string1; } }
    public string String2 { get { return _string2; } }
    public string String3 { get { return _string3; } }
    public decimal Decimal1 { get { return _decimal1; } }
}

And then instantiate a List<MyImportantThing>; this should be much faster. If you need find/select capabilities you can add them easily by creating your own list type:

public class MyImportantThingCollection : List<MyImportantThing>
{
    public MyImportantThing FindByString1(string s1)
    {
        foreach (MyImportantThing mip in this)
        {
            if (string.Equals(mip.String1, s1))
            {
                return mip;
            }
        }
        return null; // not found
    }
}
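
Rough usage sketch (the values here are made up, just to show the shape):

// Populate the collection once, then look items up by String1.
MyImportantThingCollection things = new MyImportantThingCollection();
things.Add(new MyImportantThing("ABC", "North", "GBP", 12.5m));
things.Add(new MyImportantThing("DEF", "South", "EUR", 99.0m));

MyImportantThing match = things.FindByString1("ABC");
if (match != null)
{
    Console.WriteLine(match.Decimal1); // 12.5
}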

If you need fast access through one of the properties of your class, you could always use a Dictionary<,> instead:

public class MyImportantThingDictionary : Dictionary<string, MyImportantThing>
{
}

This will have a much lower memory footprint.
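
For example, assuming String1 is what you would look items up by and that it is unique (just a guess on my part), building and querying it looks like this, reusing the collection from the earlier sketch:

// Build the dictionary keyed on String1; lookups are then O(1) instead of a linear scan.
MyImportantThingDictionary lookup = new MyImportantThingDictionary();
foreach (MyImportantThing mip in things)
{
    lookup[mip.String1] = mip;
}

MyImportantThing found;
if (lookup.TryGetValue("ABC", out found))
{
    // use found here
}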

I would also consider a completely different scenario: don't load a list of 1,500,000 items at all, and instead get the items you need from the database when you need them. That ensures you only load the rows you absolutely need. You could even load an item when it's actually being used, instead of pre-loading them all. But I guess you have your reasons.
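
If you do go that way, something roughly like this keeps only the rows you actually touch in memory. The table name, column names, key and connection string below are placeholders, not anything from your system, and it assumes System.Data.SqlClient:

// Fetch a single item on demand instead of pre-loading 1.5M rows.
using (SqlConnection conn = new SqlConnection(connectionString))
using (SqlCommand cmd = new SqlCommand(
    "SELECT String1, String2, String3, Decimal1 FROM MyTable WHERE KeyColumn = @key",
    conn))
{
    cmd.Parameters.AddWithValue("@key", key);
    conn.Open();
    using (SqlDataReader reader = cmd.ExecuteReader())
    {
        if (reader.Read())
        {
            MyImportantThing item = new MyImportantThing(
                reader.GetString(0),
                reader.GetString(1),
                reader.GetString(2),
                reader.GetDecimal(3));
            // use item here
        }
    }
}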

Jesse
Aug 2 '07 #2
Hi Jesse

Thanks for your response - it has prompted me to re-think why I need all these records, and I've just realised I don't need them all because I could be filtering them at the DB anyway.

Anyhow, the custom-class solution wouldn't apply to my situation: the DataTables are being returned from 7 web services sitting over 7 of our remote data sources (each WS takes an SQL statement and returns the results).

But thanks anyway, I'm going to refactor my code now and should have a much
more efficient solution.

Thanks
Kev
Aug 2 '07 #3
Hello Mantorok,
You're welcome

You could still put the contents of the DataTables into a custom class.
You could then throw away the DataTables.
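
Something like this for each returned table would do it (assuming the four columns come back in a known order, which is an assumption on my part):

// Copy the rows into lightweight objects, then let the DataTable go.
List<MyImportantThing> items = new List<MyImportantThing>(table.Rows.Count);
foreach (DataRow row in table.Rows)
{
    items.Add(new MyImportantThing(
        (string)row[0],
        (string)row[1],
        (string)row[2],
        (decimal)row[3]));
}
table.Dispose(); // the DataTable and its per-row bookkeeping can now be collected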

Jesse
Aug 2 '07 #4
> You're welcome
>
> You could still put the contents of the DataTables into a custom class.
> You could then throw away the DataTables.
True, however I have a function that queries all 7 sources for me and stuffs all the results into one fat table, and that is where it runs out of memory. If I created custom classes I would pretty much have to re-write this function for this one purpose, which isn't favourable.

It was sloppy of me not to restrict the data set coming back; I should know better, but I never thought about it for this particular task.

Kev
Aug 2 '07 #5
Hello Mantorok,
We all make mistakes ;). Good luck!

Jesse
Aug 2 '07 #6
Hi,

In any case, 1.5 million rows is TOO much for anything useful.

You'd be better off keeping them in a DB.

Any operation on a table with 1.5M rows will take a LONG time.
Aug 2 '07 #7
Hi,

How often do you run this?

I would suggest inserting all those results into a local SQL DB and then querying the local data.
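
Something along these lines would push each returned table into a local staging table (untested sketch; "StagingResults" and the connection string are placeholders):

// Bulk-load each web-service DataTable into a local SQL Server table,
// then filter with plain SQL instead of holding 1.5M rows in memory.
using (SqlConnection conn = new SqlConnection(localConnectionString))
{
    conn.Open();
    using (SqlBulkCopy bulk = new SqlBulkCopy(conn))
    {
        bulk.DestinationTableName = "StagingResults";
        bulk.WriteToServer(resultsTable);
    }
}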
Aug 2 '07 #8
> In any case, 1.5 million rows is TOO much for anything useful.
>
> You'd be better off keeping them in a DB.
>
> Any operation on a table with 1.5M rows will take a LONG time.
Yep - I think my brain was on vacation the day I wrote this app ;)

I was going about it completely the wrong way; all this time I could've reduced the result set and I didn't even think to. Anyway, it's all OK now.

Thanks
Kev
Aug 2 '07 #9
