
C# memory consumption outta control...

P: 4
Hello Gentlemen..

I have a program that is calculating all possible permutations of objects within an object.


The code looks very much like...
foreach (var currentItem in Items)
{
    MasterObject.Item1 = currentItem;
    foreach (var currentItem2 in Items2)
    {
        MasterObject.Item2 = currentItem2;
        foreach (var currentItem3 in Items3)
        {
            MasterObject.Item3 = currentItem3;
            foreach (var currentItem4 in Items4)
            {
And so forth... I have nested this 20 levels deep to find all possible permutations.

With about 30 objects spread across Items1, Items2, etc., the program uses about 250MB of RAM. If I add another object, especially at the lower levels of the foreach nest, memory usage might swell from 250MB to 300MB.

However, if I add an extra item to Items2, it cascades down the foreach nest and multiplies the number of permutations. When that happens, memory spikes to well over a GB. The problem is, I need to add many more objects to Items1, Items2, etc...

I thought C# would be intelligent enough to manage this memory usage; instead it's as if it is trying to keep all possible permutations in RAM as it processes the foreach nest.

Is there anything I can do to stop this? Or is there a more efficient mechanism to search through those permutations?

At the bottom of the foreach nest, if the MasterObject matches certain criteria, I do a deep copy into a MatchObjects ArrayList.

If it does not match, I reuse the same MasterObject, so it's not as if I am creating a new master object on each pass...
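Roughly, the match-collection step looks like this (MeetsCriteria and DeepCopy are placeholders for my real criteria check and copy logic):

// At the bottom of the foreach nest: snapshot MasterObject only
// when it matches; otherwise keep mutating the same instance.
if (MeetsCriteria(MasterObject))
{
    // Deep copy, so later mutations of MasterObject don't
    // overwrite the stored match.
    MatchObjects.Add(DeepCopy(MasterObject));
}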

Any solutions or ideas would be greatly appreciated.
Feb 1 '09 #1
8 Replies


Plater
Expert 5K+
P: 7,872
Well I'm not clear on every aspect, but maybe it's not getting a chance to run garbage collection?
Somewhere in your nesting add a GC.Collect() and see if that affects the memory consumption?
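Something along these lines, purely as a diagnostic (forcing collections in production code is normally discouraged, and the loop shape here is just assumed from your snippet):

// Diagnostic sketch: force a full collection once per outer
// iteration and watch whether the working set actually drops.
foreach (var currentItem in Items)
{
    MasterObject.Item1 = currentItem;
    // ... the inner foreach nest ...

    GC.Collect();                   // force a full collection
    GC.WaitForPendingFinalizers();  // let any finalizers run
    GC.Collect();                   // reclaim finalized objects
}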
Feb 2 '09 #2

P: 4
Hey thanks for the response Plater. I will give that a shot and let you guys know if it resolves the issue.
Feb 2 '09 #3

vekipeki
Expert 100+
P: 229
Note that neither the GC nor you can free memory that is actually still in use.

What I am trying to say is: it looks like you have 4 collections of items, which point to other objects. When you assign an item to a property (MasterObject.Item1 = currentItem), you are not creating a new instance of your currentItem, so this is not where your memory starts to grow.

So, if your memory consumption is low before this code begins, you must be allocating lots of stuff somewhere else inside the nested code.
Feb 3 '09 #4

10K+
P: 13,264
20 levels of foreach nesting doesn't sound clean. Perhaps you can refactor your code (maybe by creating some classes here and there) and tidy up the design a bit?
Feb 3 '09 #5

P: 4
Hello Everyone,

First of all I would like to say thank you for your posts, and apologize for not responding more promptly. Unfortunately this program was not high priority, and it took me a while to find the solution.

What was happening is this: I was storing objects that matched certain criteria out of a rapidly growing set of permutations. I was keeping every possible match, then sorting through them at the end to determine which results *best* met other sets of criteria.

In truth, out of hundreds of billions of permutations I only expected a few thousand in the final match collection.

What actually happened was that a huge number of matches were being returned and stored. The memory grew because of the sheer volume of successful matches.

Once I discovered this, I simply moved my end filtering process into the middle of the loop, so there was no longer any reason to store and sort all that data. It sped up the routine greatly, and the footprint is now very small.
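In code terms the fix amounts to something like this (the criteria names are illustrative, not my actual methods):

// Before: store every primary match, then filter/sort at the end.
// After: apply *all* the criteria inside the innermost loop, so only
// the few genuinely interesting results are ever deep-copied.
if (MeetsPrimaryCriteria(MasterObject) &&
    MeetsFinalCriteria(MasterObject))  // the end filter, moved inward
{
    matches.Add(DeepCopy(MasterObject));
}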

Again, thanks for everyone's help.
Feb 11 '09 #6

P: 4
@r035198x
Hey man, thanks for the reply. I am all about refactoring.

The nested foreach loops were a mechanism to derive all possible permutations of objects out of a set.

Think of it as...

SLOT1 (Item1, Item2, Item3, Item4, Item5)
SLOT2 (Item6, Item7, Item8, Item9, Item10)
SLOT3 (Item11, Item12, Item13, Item14, Item15)

out to 20 slots...

And you want every possible combination: Item1 in SLOT1 with Item6 in SLOT2, then with Item7 in SLOT2, and so on; then move SLOT1 on to Item2 and run through SLOT2 again from Item6, etc.

Just like counting binary...:)
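In other words, the 20 nested loops are equivalent to ticking over an odometer of indices, one per slot. A rough sketch of that equivalence (Item, SetSlot, MeetsCriteria, and DeepCopy are placeholder names):

// Odometer-style enumeration: one index per slot, incremented like
// the digits of a number, replaces N hand-written nested foreach loops.
var slots = new List<IList<Item>> { Items1, Items2, /* ... */ Items20 };
int n = slots.Count;
var idx = new int[n];  // the current "digit" for each slot

while (true)
{
    // Assemble the current combination.
    for (int i = 0; i < n; i++)
        SetSlot(MasterObject, i, slots[i][idx[i]]);

    if (MeetsCriteria(MasterObject))
        matches.Add(DeepCopy(MasterObject));

    // Increment the odometer: rightmost digit first, carry leftwards.
    int pos = n - 1;
    while (pos >= 0 && ++idx[pos] == slots[pos].Count)
    {
        idx[pos] = 0;
        pos--;
    }
    if (pos < 0) break;  // carried off the left end: all done
}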

Can you think of a more efficient way? I am sure there is some formula I could implement that would speed up the process, some type of statistics similar to PPM or CTW...??
Feb 11 '09 #7

10K+
P: 13,264
I'll stick it on my to-do list and have a look later today after I've had my coffee. In the meantime you can PM the Java village idiot JosAH and ask him to have a look at this thread. His eyes light up every time he senses a combinatorial problem.
Feb 11 '09 #8

vekipeki
Expert 100+
P: 229
It depends on your "filtering process", as you called it - i.e. how do you decide whether a permutation should be discarded, and how early can you make that decision?

For example: suppose someone told you to print all positive numbers smaller than 1,000,000 which start with a specified digit (1 to 9). A naive approach would be to count from 1 to 1,000,000 and check each number separately. That means performing the check 1,000,000 times.

But if you rethink it, you will realize that if, for example, the specified digit is "4", those numbers can only be 4, 40-49, 400-499, ..., 400000-499999.

In that case your outer loop would only count the number of digits (from 1 to 6 in this case), and in each iteration you would generate your bounds based on the known criteria.
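A minimal sketch of that idea in C# (the variable names are mine):

// Enumerate only the ranges whose leading digit is `digit`,
// instead of testing all 1,000,000 numbers one by one.
int digit = 4;
for (long lo = digit, width = 1; lo < 1000000; lo *= 10, width *= 10)
{
    // For digit 4 this yields [4,4], [40,49], [400,499], ...
    for (long n = lo; n < lo + width; n++)
        Console.WriteLine(n);
}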

So, only if you really must create every permutation and check each one individually is there truly no way to optimize; the trick is to find criteria you can apply early.
Feb 11 '09 #9
