Retrieving Unique Items from a List

Hi,

I have a generic object that has a private List field.

I populate the List in a different function.

I use a foreach loop with a FindAll call to get all the items in the list that
match certain data, and then do something with them.

The problem is that the loop repeats for every data item in the list and does
not exclude the ones already processed. How can I exclude the items that have
already been processed?

What I have is (roughly):

foreach (item in the list)
    mylist = list.FindAll(item);
    foreach (myitem in mylist)
        dosomething

The problem occurs in the outer foreach, which revisits items from the
original list that have already been processed. How do I exclude them?
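
For illustration, in C# 2.0 (no LINQ) the approach described above might look
roughly like the sketch below; Item, Key, and DoSomething are assumed
placeholder names, and the match predicate is only a guess at the FindAll
criteria:

// Rough sketch of the current approach (Item, Key, DoSomething are assumed).
// Every pass of the outer loop runs FindAll over the whole list, so items
// whose key was already handled come back again on later passes.
foreach (Item item in list)
{
    List<Item> matches = list.FindAll(delegate(Item x) { return x.Key == item.Key; });
    foreach (Item match in matches)
    {
        DoSomething(match);
    }
}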

Jun 27 '08 #1
amir wrote:
[...]

I am not sure that I 100% understand what you want to do here, but it sounds
like you want some kind of group processing, right?

So maybe something like this (using LINQ):

var groupList = from item in itemList
                group item by item.<theCriteria>;

foreach (IGrouping<CriteriaType, Item> grp in groupList)
{
    // create a new tab or page or section break / or whatever with grp.Key
    foreach (Item item in grp)
    {
        // process each item in the group
    }
}
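
A more concrete, compilable sketch of the same grouping idea might look like
this; the Item class, its Category key, and the sample data are all assumed
for illustration:

using System;
using System.Collections.Generic;
using System.Linq;

class Item
{
    public string Category;   // hypothetical grouping key
    public string Name;
}

class GroupingExample
{
    static void Main()
    {
        List<Item> itemList = new List<Item>
        {
            new Item { Category = "A", Name = "first" },
            new Item { Category = "B", Name = "second" },
            new Item { Category = "A", Name = "third" },
        };

        // Each group is visited exactly once, so no item is processed twice.
        foreach (IGrouping<string, Item> grp in itemList.GroupBy(i => i.Category))
        {
            Console.WriteLine("Group: " + grp.Key);
            foreach (Item item in grp)
            {
                Console.WriteLine("  " + item.Name);
            }
        }
    }
}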

Rgds Tim.

Jun 27 '08 #2
On Thu, 15 May 2008 15:40:02 -0700, amir <am**@discussions.microsoft.com>
wrote:
[...]
If you really must process your list items in groups according to your
FindAll() results, I think using LINQ as Tim suggests would work well.
However, I have to wonder why you want to do this. Nothing in the code
you posted indicates an actual need to do this, and it seems like you'd be
better off just enumerating the list and processing each element one by
one. The way you've shown it, you've basically got an O(N^2) algorithm
even if we assume you somehow address the duplicated item issue at no cost
(which isn't a realistic assumption).
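
For comparison, a straightforward single pass (no grouping at all) is just the
following, where ListItem and DoSomething are placeholder names:

// One pass over the list: each element is seen exactly once.
foreach (ListItem item in list)
{
    DoSomething(item);   // placeholder for whatever per-item work is needed
}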

If you can't use LINQ and you must process in groups, an alternative
solution would be to use a Dictionary where the key for the dictionary is
the same as whatever criteria you're using for the FindAll() search.
Then, when enumerating each item in the list, rather than doing work
during that enumeration, simply build up lists of elements in your
Dictionary based on that key, and then enumerate those lists later:

Dictionary<KeyType, List<ListItem>> dict =
    new Dictionary<KeyType, List<ListItem>>();

foreach (ListItem item in list)
{
    List<ListItem> listDict;

    if (!dict.TryGetValue(item.KeyProperty, out listDict))
    {
        listDict = new List<ListItem>();
        dict.Add(item.KeyProperty, listDict);
    }

    listDict.Add(item);
}

foreach (List<ListItem> listItems in dict.Values)
{
    foreach (ListItem item in listItems)
    {
        // do something
    }
}

That's only O(N) instead of O(N^2) and IMHO makes it a bit more clear that
you specifically are trying to group the items before processing.

Pete
Jun 27 '08 #3
Hello Peter and Tim,

LINQ was my first approach, but I am bogged down by users not having the
latest framework and a compatible machine. However, TryGetValue is the right
solution.

Thanks much fellas.

"Peter Duniho" wrote:
On Thu, 15 May 2008 15:40:02 -0700, amir <am**@discussions.microsoft.com>
wrote:
[...]
foreach (item in the list)
mylist = list.findall(item);
foreach(myitem in mylist)
dosomething

the problem occurs in the outer foreach where it already has processed
some
items from the original list. how do i exclude them?

If you really must process your list items in groups according to your
FindAll() results, I think using LINQ as Tim suggests would work well.
However, I have to wonder why you want to do this. Nothing in the code
you posted indicates an actual need to do this, and it seems like you'd be
better off just enumerating the list and processing each element one by
one. The way you've shown it, you've basically got an O(N^2) algorithm
even if we assume you somehow address the duplicated item issue at no cost
(which isn't a realistic assumption).

If you can't use LINQ and you must process in groups, an alternative
solution would be to use a Dictionary where the key for the dictionary is
the same as whatever criteria you're using for the FindAll() search.
Then, when enumerating each item in the list, rather than doing work
during that enumeration, simply build up lists of elements in your
Dictionary based on that key, and then enumerate those lists later:

Dictionary<KeyType, List<ListItem>dict = new Dictionary<KeyType,
List<ListItem>>();

foreach (ListItem item in list)
{
List<ListItemlistDict;

if (!dict.TryGetValue(item.KeyProperty, out listDict))
{
listDict = new List<ListItem>();
dict.Add(item.KeyProperty, listDict);
}

listDict.Add(item);
}

foreach (List<ListItemlistItems in dict.Values)
{
foreach (ListItem item in listItems)
{
// do something
}
}

That's only O(N) instead of O(N^2) and IMHO makes it a bit more clear that
you specifically are trying to group the items before processing.

Pete
Jun 27 '08 #4
