Bytes | Software Development & Data Engineering Community
For vs. For Each

Is there a performance difference between this:

\\\
Dim i As Integer
For i = 0 to myObject.Controls.Count - 1
myObject.Controls(i) = ...
Next
///

and this:

\\\
Dim ctl As Control
For Each ctl In myObject.Controls
ctl = ...
Next
///

Or is For Each just "prettier"?

Thanks,

Eric
Jul 21 '05 #1
Hi,

This was discussed a while ago, and I seem to remember that there was almost
no difference. Besides that, this is very easy to check: just write a small
program, iterate both ways, and compare the results.
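The "just measure it" suggestion is easy to follow. The thread is about VB.NET, but the shape of such a micro-benchmark is the same in any language; here is a hedged Python sketch (the function names and data sizes are purely illustrative):

```python
import time

def index_loop(items):
    """Analogous to: For i = 0 To items.Count - 1."""
    total = 0
    for i in range(len(items)):
        total += items[i]
    return total

def for_each_loop(items):
    """Analogous to: For Each item In items."""
    total = 0
    for item in items:
        total += item
    return total

def best_time(fn, items, runs=5):
    """Return the best wall-clock time for fn(items) over several runs."""
    best = float("inf")
    for _ in range(runs):
        start = time.perf_counter()
        fn(items)
        best = min(best, time.perf_counter() - start)
    return best

data = list(range(100_000))
# Compare best_time(index_loop, data) with best_time(for_each_loop, data);
# on most runtimes the gap is tiny next to any real work in the loop body.
```

Taking the best of several runs (rather than one timing) reduces noise from the scheduler and warm-up effects.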

Finally, this is a VB.NET question, not a C# one; there is no need to
post it to microsoft.public.dotnet.languages.csharp.

cheers,

--
Ignacio Machin,
ignacio.machin AT dot.state.fl.us
Florida Department Of Transportation
Jul 21 '05 #2
In VB6 you should use 'For Each' for a Collection. Here (.NET) it probably
doesn't matter.

I suggest you use 'For Each', because it may benefit from optimizations
where possible, and IMHO it looks cleaner too.

- Joris

Jul 21 '05 #4
There should be very little performance difference between these two code
snippets, especially as the number of controls is not likely to be large
anyway.

One point to note: with For Each, you cannot add items to or remove them
from the collection while iterating (an exception is thrown if you try),
whereas with a For loop you can iterate backwards and remove items safely.
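The backwards-removal pattern described above can be sketched in a few lines. This is Python rather than VB.NET, so the failure mode differs (Python lists silently skip elements when you delete while walking forward; .NET's For Each throws InvalidOperationException instead), but the safe pattern is identical:

```python
def remove_evens_backwards(items):
    """Walk the list from the last index down to 0, deleting as we go.
    Deleting items[i] only shifts elements at positions > i, and those
    have already been visited, so no element is skipped."""
    for i in range(len(items) - 1, -1, -1):
        if items[i] % 2 == 0:
            del items[i]
    return items

result = remove_evens_backwards([1, 2, 3, 4, 5, 6])
# result is [1, 3, 5]
```

A forward index loop with deletion would skip the element that slides into the just-deleted slot, which is exactly why the backwards direction matters.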


--

OHM ( Terry Burns )
. . . One-Handed-Man . . .
If U Need My Email ,Ask Me

Time flies when you don't know what you're doing

Jul 21 '05 #6
Anders Hejlsberg: "Generally my answer is: always use FOR EACH if you can,
because chances are you will have fewer bugs if you use FOR EACH. There are
just more pitfalls with the regular FOR statement. It is true that in some
cases the FOR statement is more efficient, but in the vast majority of
cases I don't think you would ever notice. My advice would be: always use
FOR EACH, then profile your app. If the loops turn out to be your problem,
change them to FOR statements, but I don't think you ever will in real
code. I think FOR EACH is much more expressive, and in theory allows us to
optimize your code more in the future. There are certain optimizations that
we will do, because we can tell that you're going over the entire
collection or whatever, and so we could, at least in theory, generate even
better code. I would highly recommend FOR EACH unless you really do need
the index."

http://msdn.microsoft.com/msdntv/epi...h/manifest.xml
Jul 21 '05 #8
Eric,
In addition to the other comments:

Are you asking about the Controls collection specifically, or about
collections in general?

Each collection type has its own performance characteristics
(ControlCollection versus ArrayList versus Collection versus an Array
versus Hashtable versus your favorite collection here).

I have to ask: does it really matter which is faster?

I would not worry about which performs better; I would go with whichever is
more straightforward. I find For Each more straightforward, so that is the
one I favor.

Remember that most programs follow the 80/20 rule (link below): 80% of the
execution time of your program is spent in 20% of your code. I will
optimize (worry about performance) that 20% once it has been identified and
proven to be a performance problem via profiling (CLR Profiler is one
profiling tool).

For info on the 80/20 rule & optimizing only the 20% see Martin Fowler's
article "Yet Another Optimization Article" at
http://martinfowler.com/ieeeSoftware...timization.pdf

Hope this helps
Jay

Jul 21 '05 #10
Would you have been happier if Eric had written the question in C#? This is
just as important a question in C# as it is in VB.NET.

foreach (Control ctl in myObject.Controls)
{
    // do something useful with 'ctl'
}

I've had folks tell me that 'for' is more efficient than 'foreach' because
of enumerator overhead. For most of my code, however, this is a moot point.
Unless the code is in a critical loop, the difference in processing is so
tiny that the improvement in code readability greatly outweighs the
overhead of letting .NET manipulate the enumerator.
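The enumerator overhead mentioned above comes from the object a for-each loop creates and drives behind the scenes. The thread's languages are VB.NET and C# (where the protocol is GetEnumerator / MoveNext / Current), but the same sugar exists in most languages; here is a hedged Python sketch of what a for-each loop expands to, using Python's equivalent protocol (iter / next):

```python
controls = ["button1", "label1", "textbox1"]

# Hand-expanded for-each loop: obtain an enumerator object, then
# repeatedly advance it until it is exhausted.
enumerator = iter(controls)      # roughly: controls.GetEnumerator()
visited = []
while True:
    try:
        ctl = next(enumerator)   # roughly: MoveNext() + Current in one step
    except StopIteration:        # roughly: MoveNext() returned False
        break
    visited.append(ctl)
# visited now holds the same elements, in the same order
```

The extra allocation and the per-step indirect calls are the "overhead"; an indexed loop skips the enumerator object entirely, which is why it can be marginally faster on arrays.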

--- Nick

"Ignacio Machin ( .NET/ C# MVP )" <ignacio.machin AT dot.state.fl.us> wrote
in message news:%2******************@TK2MSFTNGP11.phx.gbl...
<<clipped>>
and finally this is a VB.net question, not a C# one, there is no need to
post it on microsoft.public.dotnet.languages.csharp
<<clipped>>
Jul 21 '05 #12
The For loop will beat For Each in sheer performance, due to the overhead
of the enumerator. But life is not just about performance, and the cycles
you save are unlikely to affect your application unless you are running
close to scale.

I would do whichever feels most comfortable to you, for maintainability,
rather than tweak out every cycle you can. In all likelihood, the
difference is on the order of microseconds.

--
Gregory A. Beamer
MVP; MCP: +I, SE, SD, DBA

************************************************
Think Outside the Box!
************************************************
Jul 21 '05 #14
"Gabriele G. Ponti" <ggponti.at.hotmail.com> wrote in
news:#C**************@tk2msftngp13.phx.gbl:
<<clipped>>
The big man has spoken, end of discussion :D
Jul 21 '05 #16
In case you are interested in actual test results...

A while ago I tested a large number of scenarios to try to gauge the
performance difference. I tried Collections and Arrays of both Objects and
Structures, with and without collection Keys. My anticipated result was
that Arrays would be faster with indexes and Collections faster with For
Each. While this did sort of hold true, the actual results were much less
conclusive than I expected. It turns out results depend a great deal on the
underlying data types, the size of the underlying data, the quantity of
items, etc. Additionally, the various collection types have significantly
different performance characteristics as well, again based on usage.

Situations occurred in each scenario where one or two methods were much
faster than the others. Sometimes one method was faster on the early items
but slower on later ones, while another was consistently slower per item
yet, given the volume of items, much faster overall. So it is very possible
that if you tune your code for best performance on a particular test data
set, you create a situation where it performs very poorly on other data
sets.

So the bottom line is as others have suggested. If you don't need to
reference particular items explicitly, or need the results in a particular
order, always use For Each. Only if you need to add or remove items, or are
working specifically with arrays of value types, should you use an indexed
For.

Additionally, if you are working with collections, especially home-grown
strongly typed collections, and you need more performance or special
behaviour, you can create your own enumerator to optimize performance.
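The "create your own enumerator" idea can be sketched briefly. In a different language than the thread's (Python here, where `__iter__`/`__next__` play the roles of IEnumerable/IEnumerator), a hand-written enumerator for a toy collection looks like this; all class names are illustrative:

```python
class ControlEnumerator:
    """Hand-written enumerator, mirroring IEnumerator: it starts
    positioned *before* the first element, as MoveNext expects."""
    def __init__(self, controls):
        self._controls = controls
        self._pos = -1

    def __iter__(self):
        return self

    def __next__(self):              # MoveNext + Current in one call
        self._pos += 1
        if self._pos >= len(self._controls):
            raise StopIteration      # where MoveNext would return False
        return self._controls[self._pos]

class ControlCollection:
    """Toy strongly-typed collection, mirroring IEnumerable."""
    def __init__(self, controls):
        self._controls = list(controls)

    def __iter__(self):              # plays the role of GetEnumerator
        return ControlEnumerator(self._controls)

names = [c for c in ControlCollection(["ok", "cancel"])]
# names == ["ok", "cancel"]
```

Because the collection controls its own enumerator, it can skip bounds checks, cache lengths, or walk an internal structure directly, which is where the performance and special-behaviour wins come from.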

As Jay pointed out, consider the 80/20 rule; although, due to the nature of
loops, they oftentimes fall into the 20 percent of code that consumes 80
percent of the time. Still, just go with For Each and only worry about
performance if it becomes a problem.

Finally, go with Gabriele's suggestion, as he would know best ;-)

Gerald

Jul 21 '05 #18
I'll chime in here with my longtime gripe.

The foreach implementation is flawed because the container is marked as
read-only during the iteration. This is a crime in my opinion, because it
is *normal* to effect a change on the container while iterating, especially
from a VB point of view.

The workaround is impossibly difficult as well, especially for beginning to
intermediate developers. Probably the cleanest approach is to use Eric
Gunnerson's iterative container assembly, which sits between your
application and the iterated container. The least pleasing approach is to
use a try/catch block and swallow the exception. If it takes Eric Gunnerson
to implement a special strategy to handle this case, that makes my point
rather loudly.

This points to a huge flaw in the design and implementation of the foreach
construct. I don't see these guys on trial for that crime, so I am
disappointed.

--
Regards,
Alvin Bruney
[ASP.NET MVP http://mvp.support.microsoft.com/default.aspx]
Got tidbits? Get it here... http://tinyurl.com/27cok
"Cablewizard" <Ca*********@Yahoo.com> wrote in message
news:OS**************@TK2MSFTNGP09.phx.gbl...
<<clipped>>
Jul 21 '05 #20
Jay B. Harlow [MVP - Outlook] wrote:
> Are you asking about the Controls collection specifically or are you
> asking about any collection in general? As each collection type has its
> own performance characteristics!

I just meant collections in general, but you make a good point.

> I have to ask: Does it really matter which is faster?

Not on my current projects, but it's good to know for the future.

> Remember that most programs follow the 80/20 rule (link below): 80% of
> the execution time of your program is spent in 20% of your code. I will
> optimize (worry about performance) the 20% once that 20% has been
> identified and proven to be a performance problem via profiling (CLR
> Profiler is one profiling tool).

I usually follow the 80/20/100 rule, where I optimize the 20% and give the
remaining 80% a good working-over anyway, to make sure the aggregate is as
efficient as possible. ;-)

Thanks for your reply, Jay.

Eric
Jul 21 '05 #22
Thanks for the post, Nick.

Nick Malik wrote:
> I've had folks tell me that 'for' is more efficient than 'foreach'
> because of enumerator overhead.

A newbie question... Where do enumerators come into play when using For
Each? In addition, since enumerators can only be of type byte, short, int,
or long, what kind of overhead is introduced?

Thanks again,

Eric
Jul 21 '05 #24
He is referring to the enumerator interface, IEnumerator. See below; you
can create your own. (Visual Studio only generates the empty member stubs;
the backing array and position field are added here so the members actually
do something.)

Public Class Class1
    Implements IEnumerator

    Private _items() As Object = {}
    Private _pos As Integer = -1

    Public ReadOnly Property Current() As Object _
            Implements System.Collections.IEnumerator.Current
        Get
            Return _items(_pos)
        End Get
    End Property

    Public Function MoveNext() As Boolean _
            Implements System.Collections.IEnumerator.MoveNext
        _pos += 1
        Return _pos < _items.Length
    End Function

    Public Sub Reset() Implements System.Collections.IEnumerator.Reset
        _pos = -1
    End Sub
End Class

--

OHM ( Terry Burns )
. . . One-Handed-Man . . .
If U Need My Email ,Ask Me

Time flies when you don't know what you're doing

<an*******@discussions.microsoft.com> wrote in message
news:%2******************@TK2MSFTNGP10.phx.gbl...
Thanks for the post, Nick.

Nick Malik wrote:
I've had folks tell me that 'for' is more efficient than 'foreach' because
of enumerator overhead.

A newbie question... Where do enumerators come into play when using For Each?
In addition, since enumerators can only be of type byte, short, int, or
long, what kind of overhead is introduced?

Thanks again,

Eric

Jul 21 '05 #26
an*******@discussions.microsoft.com wrote:
Thanks for the post, Nick.

Nick Malik wrote:
I've had folks tell me that 'for' is more efficient than 'foreach' because
of enumerator overhead.

A newbie question...Where do enumerators come into play when using For Each?
In addition , since enumerators can only be of type byte, short, int, or
long, what kind of overhead is introduced?


By 'enumerators' Nick was referring to the IEnumerator interface, which
is used by the For Each statement to iterate through the collection (For
Each uses the IEnumerable interface to get the IEnumerator).

I think you're confusing the terminology with an enumeration, such as one
declared by VB.NET's Enum statement. A completely different animal.

--
mikeb
Jul 21 '05 #28
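To make the distinction concrete, For Each is roughly shorthand for the following hand-written enumerator loop. This is only a sketch of what the compiler generates, using the Controls collection from the original question:

```vbnet
' Approximate expansion of:
'   For Each ctl As Control In myObject.Controls : ... : Next
Dim enumerator As IEnumerator = myObject.Controls.GetEnumerator()
While enumerator.MoveNext()
    Dim ctl As Control = DirectCast(enumerator.Current, Control)
    ' ... loop body uses ctl here ...
End While
```

The "enumerator overhead" Nick mentions is simply the GetEnumerator call, the per-element MoveNext call, and the cast from Object.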
"mikeb" wrote:
I think you're confusing the terminology with an enumeration, such as
declared by VB.NET's Enum statement. A completely different animal.


That's exactly what I was doing. Thank you for the clarification, Mike &
Terry.

Eric
Jul 21 '05 #30
"mikeb" wrote:
I think you're confusing the terminology with an enumeration, such as
declared by VB.NET's Enum statement. A completely different animal.


That's exactly what I was doing. Thank you for the clarification, Mike &
Terry.

Eric
Jul 21 '05 #31
I disagree that foreach-construct being readonly is a bad thing. Not
to completely disregard Alvin's gripe, but here's my point of view.

Typically, you use foreach to iterate through the collection.
Adding/removing items from the collection during this time puts the
collection into a funny mode that others may not be ready to deal
with; what if you have multiple enumerators? (This is very common in
a nested foreach scenario; yes, that'd be O(n^2).) I believe the C++
STL lets you remove the currently iterated item, but that opens a
can of worms: you always have to worry whether your current item has
been deleted by another thread.

Also, I highly disagree that the for-construct is faster than the
foreach-construct. That is only true when you are talking about ARRAY
collection types. In a linked-list implementation, the foreach-construct
(O(1) per step) would be faster than the for-construct (O(n) per indexed
access) for iterating through a collection. The fact that the .NET
framework collections are almost solely based on array types may make the
statement correct 90%+ of the time, but it is not a correct statement
to make generally. And, besides, I wait for generics!

Typically, hashtables are iterated for the entire set of key/value pairs.
Given that accessing a value in a hashtable is O(1), if you need to
iterate through a hashtable it's easier to iterate through the keys,
O(n), grabbing each value as you go, O(1). But I can't think of why
anyone would iterate through a hashtable except maybe as a debug
step to see its contents.

jliu - www.ssw.com.au - johnliu.net
Jul 21 '05 #32
"JohnLiu" <jo******@gmail.com> wrote in message
news:37**************************@posting.google.c om...
I disagree that foreach-construct being readonly is a bad thing. Not
to completely disregard Alvin's gripe, but here's my point of view.

Typically, you use foreach to iterate throught the collection.
Adding/Removing items from the collection during this time puts the
collection into a funny mode that others may not be ready to deal
with, what if you have multiple emunerators? (this is very common for
a nested foreach scenario, yes, that'd be O(n^2) ). I believe in C++
STL libraries you can remove current iterated item, but that opens a
can of worms, you always have to worry whether your current item has
been deleted by another thread.

Also, I highly disagree that the for-construct is faster than
foreach-construct. That is only true when you are talking about ARRAY
collection types. In a linked-list implementation, foreach-construct
O(1) would be faster than for-construct O(n) for iterating through a
collection. The fact that the .NET framework collections are almost
solely based on array types may make the statement correct in 90%+ of
the time, but it is not a correct statement to make generally. And,
besides, I wait for generics!

Typically, hashtables are iterated for the entire key/value pairs,
given that accessing the value of a hashtable is O(1), if you need to
iterate through a hastable, it's easier to iterate through the keys
O(n), and grabbing the value as you go O(1). But I can't think of why
anyone would be iterating through a hashtable except may be as a debug
step to see the contents of the hashtable.

jliu - www.ssw.com.au - johnliu.net


With a For n = start To end Step loop you would still have to worry whether the
current element has been deleted by this or another thread, since the
start, end & step expressions are only evaluated once, when the loop is entered.

--
Jonathan Bailey.
Jul 21 '05 #33
On 2004-08-11, Alvin Bruney [MVP] <> wrote:
I'll chime in here with my longtime gripe.

The foreach implementation is flawed because the container is marked as
readonly during the iteration. This is a crime in my opinion because it is
*normal to effect a change on the container while iterating especially from
a vb point of view.


I see your point, but look at it from the implementers' point of view.

Making the container read-only allows for very efficient implementations
of the enumerator, and also makes writing new enumerators fairly simple.
Also, it eliminates a real ambiguity in the For Each statement: does
For Each iterate over the original collection, or over the entire
collection as it changes over time?

Dim i As Integer
For Each o As Object In MyCollection
    i += 1
    If i = 3 Then
        MyCollection.Insert(0, New Object())
        MyCollection.Add(New Object())
    End If
Next

What would the iteration be in this case? Should both new objects be
iterated, or neither? Or just one? You could think of some reasonable
rules to apply to arrays, but what about things like hashes where
position doesn't have a fixed meaning? And how is the Enumerator
supposed to keep track of what's happening to the collection? Do we add
some kind of event to the IEnumerable interface? If so, that could turn
into a lot of overhead since the enumerator has to check for changes on
each iteration.

For efficiency's sake, maybe we could have two different enumeration
types, one for mutable containers and one for read-only, but then not
only are you complicating the class library tremendously, but calling
conventions can get strange (since only one of them can use For Each).

David
Jul 21 '05 #34
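One common way around the ambiguity David describes is to enumerate a snapshot of the collection, leaving the live collection free to change. A sketch, where MyCollection is the list from David's example and ShouldRemove is a hypothetical predicate:

```vbnet
' Enumerate a copy; mutating the live collection is then safe,
' because For Each never touches MyCollection's own enumerator.
Dim snapshot As New ArrayList(MyCollection)
For Each o As Object In snapshot
    If ShouldRemove(o) Then
        MyCollection.Remove(o)
    End If
Next
```

The cost is an extra O(n) copy up front, which is usually negligible next to the loop body itself.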
Can you give some sample applications where this statement of yours is true?
Although due to the nature of loops, they oftentimes fall into
the 20 percent of code that consumes 80 percent of the time.


In my opinion it is definitely not true for applications that do, for
instance, screen painting and/or data processing.

In my opinion it is definitely true for image-processing applications
where GDI+ encoding is not used.

However, that is in my opinion surely not the majority of applications.

So I am curious: in what other types of applications can stand-alone loops
consume 80% of the time?

Just my thought,

Cor
Jul 21 '05 #35
> Making the container read-only allows for very efficient implementations
of the enumerator, and also makes writing new enumerators fairly simple.
Also, it eliminates a real ambiguity to the For Each statement, does
foreach iterate over the original collection, or over the entire
collection as it changes over time?
I don't disagree with that; very good point indeed. But the current
approach makes it impossible to perform simple tasks inherent in UI
programming (like removing multiselects in a listbox, for instance). Where
such simple tasks are overly complicated, I believe the design should be
reviewed.
For efficiency's sake, maybe we could have two different enumeration
types, one for mutable containers and one for read-only, but then not
only are you complicating the class library tremendously,
I think it is a reasonable approach. It would just be another way to iterate
a container, and it shouldn't complicate matters since it could be made to
appear as an overload.

but calling conventions can get strange (since only one of them can use For Each).
That's really a design issue which needs to be hashed out in a way that makes
this approach feasible.

--
Regards,
Alvin Bruney
[ASP.NET MVP http://mvp.support.microsoft.com/default.aspx]
Got tidbits? Get it here... http://tinyurl.com/27cok
"David" <df*****@woofix.local.dom> wrote in message
news:slrnchmj96.j7j.df*****@woofix.local.dom... On 2004-08-11, Alvin Bruney [MVP] <> wrote:
I'll chime in here with my longtime gripe.

The foreach implementation is flawed because the container is marked as
readonly during the iteration. This is a crime in my opinion because it
is
*normal to effect a change on the container while iterating especially
from
a vb point of view.


I see your point, but look at it from the implementers' point of view.

Making the container read-only allows for very efficient implementations
of the enumerator, and also makes writing new enumerators fairly simple.
Also, it eliminates a real ambiguity to the For Each statement, does
foreach iterate over the original collection, or over the entire
collection as it changes over time?

Dim i As Integer
For Each o as Object in MyCollection
i += 1
If i = 3 Then
MyCollection.Insert(0, new Object())
MyCollection.Add(New Object())
End If
Next

What would the iteration be in this case? Should both new objects be
iterated, or neither? Or just one? You could think of some reasonable
rules to apply to arrays, but what about things like hashes where
position doesn't have a fixed meaning? And how is the Enumerator
supposed to keep track of what's happening to the collection? Do we add
some kind of event to the IEnumerable interface? If so, that could turn
into a lot of overhead since the enumerator has to check for changes on
each iteration.

For efficiency's sake, maybe we could have two different enumeration
types, one for mutable containers and one for read-only, but then not
only are you complicating the class library tremendously, but calling
conventions can get strange (since only one of them can use For Each).

David

Jul 21 '05 #36
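For the listbox case specifically, the usual workaround is to fall back on an indexed For loop that walks backwards, so each removal cannot shift the indices still to be visited. A sketch, assuming a ListBox control named ListBox1:

```vbnet
' Remove every selected item without enumerating the collection
' being modified: walk the index range from the end downwards.
Dim i As Integer
For i = ListBox1.Items.Count - 1 To 0 Step -1
    If ListBox1.GetSelected(i) Then
        ListBox1.Items.RemoveAt(i)
    End If
Next
```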
> Typically, you use foreach to iterate throught the collection.
Adding/Removing items from the collection during this time puts the
collection into a funny mode that others may not be ready to deal
with, what if you have multiple emunerators?
That is a design and implementation issue, not a programming issue.
Iterating a container which can change during iteration is rightly handled
internally by the construct itself, not by the iterating code, so there
should be no funny mode. For instance, what's to stop the internal code from
re-adjusting its contents based on the removal or addition of an item on the
fly? This is very basic functionality available in VB, if memory serves me
right.

Multiple enumerators can be handled internally through synchronization,
and this can all be hidden from the programmer so that she is not aware of how
the iterating construct is implemented (good design). I think the choice to
implement this construct as readonly must have come down to efficiency over
functionality. That's the only reason I can think of.

--
Regards,
Alvin Bruney
[ASP.NET MVP http://mvp.support.microsoft.com/default.aspx]
Got tidbits? Get it here... http://tinyurl.com/27cok
"JohnLiu" <jo******@gmail.com> wrote in message
news:37**************************@posting.google.c om...I disagree that foreach-construct being readonly is a bad thing. Not
to completely disregard Alvin's gripe, but here's my point of view.

Typically, you use foreach to iterate throught the collection.
Adding/Removing items from the collection during this time puts the
collection into a funny mode that others may not be ready to deal
with, what if you have multiple emunerators? (this is very common for
a nested foreach scenario, yes, that'd be O(n^2) ). I believe in C++
STL libraries you can remove current iterated item, but that opens a
can of worms, you always have to worry whether your current item has
been deleted by another thread.

Also, I highly disagree that the for-construct is faster than
foreach-construct. That is only true when you are talking about ARRAY
collection types. In a linked-list implementation, foreach-construct
O(1) would be faster than for-construct O(n) for iterating through a
collection. The fact that the .NET framework collections are almost
solely based on array types may make the statement correct in 90%+ of
the time, but it is not a correct statement to make generally. And,
besides, I wait for generics!

Typically, hashtables are iterated for the entire key/value pairs,
given that accessing the value of a hashtable is O(1), if you need to
iterate through a hastable, it's easier to iterate through the keys
O(n), and grabbing the value as you go O(1). But I can't think of why
anyone would be iterating through a hashtable except may be as a debug
step to see the contents of the hashtable.

jliu - www.ssw.com.au - johnliu.net

Jul 21 '05 #37
On Wed, 11 Aug 2004 08:21:47 -0500, an*******@discussions.microsoft.com
wrote:
Is there a performance difference between this:

\\\
Dim i As Integer
For i = 0 to myObject.Controls.Count - 1
myObject.Controls(i) = ...
Next
///

and this:

\\\
Dim ctl As Control
For Each ctl In myObject.Controls
ctl = ...
Next
///

Or is For Each just "prettier"?


They are almost identical when your collection is some sort of array.
But if the collection is, e.g., a linked list, then executing .Controls(n) will
cause your app to traverse through n elements: the bigger n is, the longer
it will take to find the n-th element. Using enumerators (For Each) is
considerably faster here.

Best regards,
Michal Dabrowski
Jul 21 '05 #38
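Michal's point can be illustrated with a toy singly linked list, where indexed access must walk from the head on every call. SimpleList here is purely illustrative, not a framework type:

```vbnet
' Item(n) walks n nodes, so "For i = 0 To Count - 1 : list.Item(i)"
' costs O(n^2) overall, while an enumerator-style walk is O(n).
Public Class SimpleList
    Private Class Node
        Public Value As Object
        Public NextNode As Node
    End Class

    Private head As Node
    Private tail As Node

    Public Sub Add(ByVal value As Object)
        Dim n As New Node
        n.Value = value
        If head Is Nothing Then head = n Else tail.NextNode = n
        tail = n
    End Sub

    ' O(n) per call: the price an indexed For loop pays on every access.
    Public ReadOnly Property Item(ByVal index As Integer) As Object
        Get
            Dim walker As Node = head
            Dim i As Integer
            For i = 1 To index
                walker = walker.NextNode
            Next
            Return walker.Value
        End Get
    End Property
End Class
```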
> I don't disagree with that. very good point indeed. but, the current
approach makes it impossible to perform simple tasks inherent in UI
programming (like removing multiselects in a listbox for instance). Where
such simple tasks are overly complicated, i believe the design should be
reviewed.


Good point. I like the idea of a collection object that isn't read-only,
separate from other types of collections. Didn't another thread mention a
bit of code that Ericgu put out that does exactly this?
Jul 21 '05 #39
From this document

http://msdn.microsoft.com/library/de...tchPerfOpt.asp

The performance difference between For and For Each loops does not appear to
be significant.

I hope this helps?

Cor
Jul 21 '05 #40
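For anyone who would rather measure than take the article's word for it, a rough harness along these lines will do. This is a sketch; Environment.TickCount is coarse (roughly 10-15 ms resolution), so a large element count is needed for a stable reading:

```vbnet
Imports System
Imports System.Collections

Module LoopTiming
    Sub Main()
        ' Build a large ArrayList to iterate both ways.
        Dim list As New ArrayList()
        Dim i As Integer
        For i = 0 To 999999
            list.Add(i)
        Next

        ' Indexed For loop.
        Dim start As Integer = Environment.TickCount
        Dim total As Long = 0
        For i = 0 To list.Count - 1
            total += CInt(list(i))
        Next
        Console.WriteLine("For:      {0} ms", Environment.TickCount - start)

        ' For Each over the same list.
        start = Environment.TickCount
        total = 0
        For Each o As Object In list
            total += CInt(o)
        Next
        Console.WriteLine("For Each: {0} ms", Environment.TickCount - start)
    End Sub
End Module
```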
Alvin,
VB, for as long as I can remember (VB1, VB2, VB3, VB5, VB6, VBA), has had
trouble modifying the collection itself when you use For Each. There may
have been one or two specific collections that may have worked, or more than
likely one thought they worked, but really didn't.

The problem is the delete/insert code would need some method of notifying
(an event possibly) one or more enumerators that the collection itself was
modified; this notification IMHO is for the most part too expensive to
justify adding in all cases.

Although I do agree, it would be nice if collections had optional
enumerators: the "fire hose" version of today, which is normally used, plus
a "safe" version that allowed modifying the collection itself while you're
iterating... For example: using For Each on DataTable.Rows is not
modifiable, while using For Each on DataTable.Select is modifiable! By
modifiable I mean you can call DataRow.Delete or Rows.Add...

Hope this helps
Jay

"Alvin Bruney [MVP]" <vapor at steaming post office> wrote in message
news:uZ**************@TK2MSFTNGP10.phx.gbl...
Typically, you use foreach to iterate throught the collection.
Adding/Removing items from the collection during this time puts the
collection into a funny mode that others may not be ready to deal
with, what if you have multiple emunerators?
That is a design and implementation issue, not a programming issue.
Iterating a container which can change during iteration is rightly handled
internally by the construct itself and not by the iterating code so there
should be no funny mode. For instance, what's to stop the internal code

from re-adjusting its contents based on the removal or addition of an item on the fly? This is very basic functionality available in vb if memory serves me
right.

Multiple enumerators can be handled internally thru synchronization means
and this can all be hidden from the programmer so that she is not aware how the iterating construct is implemented (good design). I think the choice to implement this construct as readonly must have come down to efficiency over functionality. That's the only reason I can think of.

--
Regards,
Alvin Bruney
[ASP.NET MVP http://mvp.support.microsoft.com/default.aspx]
Got tidbits? Get it here... http://tinyurl.com/27cok
"JohnLiu" <jo******@gmail.com> wrote in message
news:37**************************@posting.google.c om...
I disagree that foreach-construct being readonly is a bad thing. Not
to completely disregard Alvin's gripe, but here's my point of view.

Typically, you use foreach to iterate throught the collection.
Adding/Removing items from the collection during this time puts the
collection into a funny mode that others may not be ready to deal
with, what if you have multiple emunerators? (this is very common for
a nested foreach scenario, yes, that'd be O(n^2) ). I believe in C++
STL libraries you can remove current iterated item, but that opens a
can of worms, you always have to worry whether your current item has
been deleted by another thread.

Also, I highly disagree that the for-construct is faster than
foreach-construct. That is only true when you are talking about ARRAY
collection types. In a linked-list implementation, foreach-construct
O(1) would be faster than for-construct O(n) for iterating through a
collection. The fact that the .NET framework collections are almost
solely based on array types may make the statement correct in 90%+ of
the time, but it is not a correct statement to make generally. And,
besides, I wait for generics!

Typically, hashtables are iterated for the entire key/value pairs,
given that accessing the value of a hashtable is O(1), if you need to
iterate through a hastable, it's easier to iterate through the keys
O(n), and grabbing the value as you go O(1). But I can't think of why
anyone would be iterating through a hashtable except may be as a debug
step to see the contents of the hashtable.

jliu - www.ssw.com.au - johnliu.net


Jul 21 '05 #41
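Jay's DataTable example can be sketched as follows. DataTable.Select returns a DataRow array, effectively a snapshot, so the underlying Rows collection may be changed while iterating it; "table" is a placeholder for any populated DataTable:

```vbnet
' Safe: we enumerate the array returned by Select, not table.Rows,
' so deleting rows does not invalidate the enumerator.
Dim row As DataRow
For Each row In table.Select()
    row.Delete()
Next

' Not safe: For Each directly on table.Rows while calling
' row.Delete() modifies the collection being enumerated.
```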
I gave this a little bit of thought. I realized that the sort of coding I do is
quite different than what "most" people are doing. I perform mostly engineering
and geospatial analysis. For me, this involves many loops, and loops within
loops, along with iterating over recordsets countless times. In a literal sense,
this is data processing in the extreme, but certainly not like manual data
entry.

However, loops are used to perform some sort of search and/or work on a block of
items. By nature, they can consume a decent portion of the overall processing
time as they are oftentimes the place where much of the actual work is taking
place. Any number of smaller functions may be performed, but potentially it is
performed many if not thousands of times. In this particular thread's example,
the operator is iterating through all the controls in a collection, presumably
to do something with them. I would hazard a guess that if you compared the
overall processor time spent within the scope of the loop, it would be
significant relative to other non-loop functions.

So while what I do may in fact be much different than most others, I still stand
by my statement. Just look at the number of times people want to know how to
keep their GUI responsive while some sort of iterative process is occurring.
Forget for a moment about the design considerations of what is really happening.
Bottom line is that the iterative processing is consuming an amount of time
significant enough to be noticeable to the operator.

Since you asked, and to exemplify Alvin's comments, here is a common occurrence
for me.
(For those who don't want to read a confusing and long-winded example, stop
reading here.)
I have a geospatial dataset that contains some number of polygons/regions.
I need to find any overlapping/intersecting regions and degenerate those
intersections into separate regions.
This requires iterating through every element in the dataset and comparing it to
every other element.
Additionally, for every potentially intersecting element combination, you must
iterate through every combination of vertices/segments to determine
intersection.
Each combination of intersections could result in the creation of a new region.
Each new region could also intersect with subsequent existing and/or new
regions, which could also generate new regions...
Now, if existing regions could be degenerated into sub-regions, I could
remove the existing regions from the collection and add the new regions to the
end of the collection, then theoretically I could determine all possible tests
within the scope of 1 top-level For Each loop. But instead, I must be creative
and do something like mark the existing regions for deletion within the master
collection, add the newly created regions to a separate collection. Then perform
the same iteration over the new collection, and potentially create an additional
collection, and so on. Once all combinations are resolved, then I must go back
and iterate through all of the resulting collections to recreate the master
collection. Now in practice, the resulting implementation isn't exactly like
that, but logically it is similar.

So for me, loop performance and implementation is extremely important.

Gerald

"Cor Ligthert" <no**********@planet.nl> wrote in message
news:ei**************@TK2MSFTNGP10.phx.gbl...
Can you give some sample applications where this statement of you is true?
Although due to the nature of loops, they oftentimes fall into
the 20 percent of code that consumes 80 percent of the time.


It is in my opinion definitly not with applications where is by instance
screen painting or/and dataprocessing.

It is in my opinion definitly true for applications where is image
processing where not the GDI+ encoding is used.

However that is in my opinion surely not the majority of the applications.

So I am curious in what type of other applications stand alone loops can
consume 80% of the time?

Just my thought,

Cor

Jul 21 '05 #42
My recollection is that MSFT claimed that .NET, for practical purposes,
eliminated the difference in speed between For and For Each, but I've not
recently tested that assertion.

--
http://www.standards.com/; See Howard Kaikow's web site.
"Nick Malik" <ni*******@hotmail.nospam.com> wrote in message
news:BgqSc.130747$eM2.70902@attbi_s51...
Would you have been happier if Eric had written the question in C#? This is
very much as important a question in C# as it is in VB.NET.

foreach (Control ctl in myObject.Controls)
{
// do something useful with 'ctl'

}

I've had folks tell me that 'for' is more efficient than 'foreach' because
of enumerator overhead. For most of my code, however, this is a moot point.
Unless the code is in a critical loop, the difference in processing is so tiny
that the improvement in code readability greatly outweighs the overhead of
allowing .NET to manipulate the enumerator.

--- Nick

"Ignacio Machin ( .NET/ C# MVP )" <ignacio.machin AT dot.state.fl.us> wrote in message news:%2******************@TK2MSFTNGP11.phx.gbl...
<<clipped>>
and finally this is a VB.net question, not a C# one, there is no need to
post it on microsoft.public.dotnet.languages.csharp

<<clipped>>

<an*******@discussions.microsoft.com> wrote in message
news:OY**************@TK2MSFTNGP11.phx.gbl...
Is there a performance difference between this:

\\\
Dim i As Integer
For i = 0 to myObject.Controls.Count - 1
myObject.Controls(i) = ...
Next
///

and this:

\\\
Dim ctl As Control
For Each ctl In myObject.Controls
ctl = ...
Next
///

Or is For Each just "prettier"?

Thanks,

Eric



Jul 21 '05 #43
Gerald,

I read it completely; however, in my opinion everything that happens
between processor and memory is nowadays extremely fast, and that is often
forgotten. (I am not writing this about your situation.)

There is a lot of looping in every program, even when you try to avoid it. I
think that the code which is created from the IL will contain a lot of loops.

Looping is in my opinion the basis of good programming, and people who think
they can avoid it mostly end up making even more code to process or stop the
loop. (For instance, by adding a test to the loop, which of course costs more
than a simple change of a byte.)

The performance difference of the methods can be neglected; see for that the
MSDN article I pointed to in the main thread of this thread.

I find it mostly overdone how much attention people pay to a loop, while
the total throughput time will mostly not change.

I wrote "mostly". I think that for you especially it needs to be done well,
and that in a lot of situations it needs extra attention. However, when it
comes to optimizing the throughput, I would first look in most cases at
other parts of the program.

Just my thought,

Cor
I gave this a little bit of thought. I realized that the sort of coding I do is quite different than what "most" people are doing. I perform mostly engineering and geospatial analysis. For me, this involves many loops, and loops within loops. Along with iterating over recordsets countless times. In a literal since, this is data processing in the extreme, but certainly not like manual data
entry.

However, loops are used to perform some sort of search and/or work on a block of items. By nature, they can consume a decent portion of the overall processing time as they are oftentimes the place where much of the actual work is taking place. Any number of smaller functions may be performed, but potentially it is performed many if not thousands of times. In this particular thread's example, the operator is iterating through all the controls in a collection, presumably to do something with them. I would hazard a guess that if you compared the
overall processor time spent within the scope of the loop, it would be
significant relative to other non-loop functions.

So while what I do may in fact be much different than most others, I still stand by my statement. Just look at the number of times people want to know how to keep their GUI responsive while some sort of iterative process is occurring. Forget for a moment about the design considerations of what is really happening. Bottom line is that the iterative processing is consuming an amount of time significant enough to be noticeable to the operator.

Since you asked, and to exemplify Alvin's comments, here is a common occurrence for me.
(For those who don't what to read a confusing and long-winded example, stop reading here)
I have a geospatial dataset that contains some number of polygons/regions.
I need to find any overlapping/intersecting regions and degenerate those
intersections into separate regions.
This requires iterating through every element in the dataset and compare it to every other element.
Additionally, for every potentially intersecting element combination, you must iterate through every combination of vertices/segments to determine
intersection.
Each combination of intersections could result in the creation of a new region. Each new region could also intersect with subsequent existing and/or new
regions, which could also generate new regions...
Now, if when existing regions could be degenerated into sub-regions, I could remove the existing regions from the collection and add the new regions to the end of the collection, then theoretically I could determine all possible tests within the scope of 1 top-level For Each loop. But instead, I must be creative and do something like mark the existing regions for deletion within the master collection, add the newly created regions to a separate collection. Then perform the same iteration over the new collection, and potentially create an additional collection, and so on. Once all combinations are resolved, then I must go back and iterate through all of the resulting collections to recreate the master collection. Now in practice, the resulting implementation isn't exactly like that, but logically it is similar.

So for me, loop performance and implementation is extremely important.

Gerald

"Cor Ligthert" <no**********@planet.nl> wrote in message
news:ei**************@TK2MSFTNGP10.phx.gbl...
Can you give some sample applications where this statement of you is true?
Although due to the nature of loops, they oftentimes fall into
the 20 percent of code that consumes 80 percent of the time.


It is in my opinion definitly not with applications where is by instance
screen painting or/and dataprocessing.

It is in my opinion definitly true for applications where is image
processing where not the GDI+ encoding is used.

However that is in my opinion surely not the majority of the applications.
So I am curious in what type of other applications stand alone loops can
consume 80% of the time?

Just my thought,

Cor


Jul 21 '05 #44
Cor,

If I understand your comments, then I completely agree.
1. Use loops appropriately.
2. Don't loop when it is not necessary.
3. Do loop when appropriate.
4. When you do use a loop, in the end it makes little difference if you use Do,
While, For Index, or For Each. My own testing has shown this to be true.
5. What you do while in the loop is much more important than the loop itself.
Make it as efficient as practical.
6. Just make sure that your overall code design and implementation is done
well/correctly.
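Point 4 is easy to check empirically. The following is a minimal sketch of such a test, in Python rather than the thread's VB.NET purely so the shape is visible; the two functions mirror the index-based and enumerator-based loops from the original question:

```python
import timeit

data = list(range(10_000))

def by_index():
    # "For i = 0 To Count - 1" style: an explicit index drives the loop.
    total = 0
    for i in range(len(data)):
        total += data[i]
    return total

def by_each():
    # "For Each" style: the enumerator drives the loop.
    total = 0
    for x in data:
        total += x
    return total

# Both styles do identical work; any timing gap between them is
# usually dwarfed by the loop body itself, as point 5 says.
t_index = timeit.timeit(by_index, number=50)
t_each = timeit.timeit(by_each, number=50)
```

Running this a few times typically shows the two timings within noise of each other, which matches the MSDN article cited elsewhere in the thread.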

In the end, follow Jay's advice. Try to do it right in the first place, and if
you find out it is a problem, then worry about the extra code to try to make it
faster.

Gerald

"Cor Ligthert" <no**********@planet.nl> wrote in message
news:%2****************@TK2MSFTNGP10.phx.gbl...
Gerald,

I read it completely; however, in my opinion everything that happens between processor and memory nowadays is extremely fast, and that is often forgotten. (I am not writing this about your situation.)

There is a lot of looping in every program even when you try to avoid it. I think that the code which is created from the IL will make a lot of loops.

Looping is in my opinion the basis of good programming, and people who think they can avoid it are mostly writing even more code to process or stop the loop. (For instance by making a test in the loop, which of course costs more than a simple change of a byte.)

The performance difference of the methods can be neglected; see for that the MSDN article I pointed to in the main thread of this thread.

I find it mostly overdone how much attention people pay to a loop, while the total throughput time will mostly not change.

I wrote mostly. I think that for everyone, and for you especially, it needs to be done well, and that in a lot of situations it needs extra attention. However, when it comes to optimizing the throughput, I would in most cases first look at other parts of the program.

Just my thought,

Cor
I gave this a little bit of thought. I realized that the sort of coding I do is
quite different than what "most" people are doing. I perform mostly engineering
and geospatial analysis. For me, this involves many loops, and loops within
loops, along with iterating over recordsets countless times. In a literal sense,
this is data processing in the extreme, but certainly not like manual data
entry.

However, loops are used to perform some sort of search and/or work on a block of
items. By nature, they can consume a decent portion of the overall processing
time, as they are oftentimes the place where much of the actual work is taking
place. Any number of smaller functions may be performed, but potentially each is
performed many if not thousands of times. In this particular thread's example,
the operator is iterating through all the controls in a collection, presumably
to do something with them. I would hazard a guess that if you compared the
overall processor time spent within the scope of the loop, it would be
significant relative to other non-loop functions.

So while what I do may in fact be much different than most others, I still stand
by my statement. Just look at the number of times people want to know how to
keep their GUI responsive while some sort of iterative process is occurring.
Forget for a moment about the design considerations of what is really happening.
Bottom line is that the iterative processing is consuming an amount of time
significant enough to be noticeable to the operator.

Since you asked, and to exemplify Alvin's comments, here is a common occurrence
for me. (For those who don't want to read a confusing and long-winded example,
stop reading here.)

I have a geospatial dataset that contains some number of polygons/regions. I
need to find any overlapping/intersecting regions and degenerate those
intersections into separate regions. This requires iterating through every
element in the dataset and comparing it to every other element. Additionally,
for every potentially intersecting element combination, you must iterate through
every combination of vertices/segments to determine intersection. Each
combination of intersections could result in the creation of a new region. Each
new region could also intersect with subsequent existing and/or new regions,
which could also generate new regions...

Now, if existing regions could be degenerated into sub-regions, I could remove
the existing regions from the collection and add the new regions to the end of
the collection; then theoretically I could determine all possible tests within
the scope of one top-level For Each loop. But instead, I must be creative and do
something like mark the existing regions for deletion within the master
collection and add the newly created regions to a separate collection, then
perform the same iteration over the new collection, and potentially create an
additional collection, and so on. Once all combinations are resolved, then I
must go back and iterate through all of the resulting collections to recreate
the master collection. Now in practice, the resulting implementation isn't
exactly like that, but logically it is similar.

So for me, loop performance and implementation is extremely important.

Gerald

"Cor Ligthert" <no**********@planet.nl> wrote in message
news:ei**************@TK2MSFTNGP10.phx.gbl...
Can you give some sample applications where this statement of you is true?
> Although due to the nature of loops, they oftentimes fall into
> the 20 percent of code that consumes 80 percent of the time.

It is in my opinion definitely not true for applications where there is, for instance, screen painting and/or data processing.

It is in my opinion definitely true for applications where there is image processing and the GDI+ encoding is not used.

However, that is in my opinion surely not the majority of applications. So I am curious: in what type of other applications can stand-alone loops consume 80% of the time?

Just my thought,

Cor



Jul 21 '05 #45
On 2004-08-12, Alvin Bruney [MVP] <> wrote:
David Wrote
Also, it eliminates a real ambiguity to the For Each statement, does
foreach iterate over the original collection, or over the entire
collection as it changes over time?
I don't disagree with that; very good point indeed. But the current
approach makes it impossible to perform simple tasks inherent in UI
programming (like removing multiselects in a listbox, for instance).


Why is that difficult?

For Each item As Object In New ArrayList(ListBox1.SelectedItems)
ListBox1.Items.Remove(item)
Next
Where
such simple tasks are overly complicated, I believe the design should be
reviewed.


Well, the case where you want to iterate over the original items while
mutating the collection is always trivial, just copy the references and
enumerate the copy. I don't see why we'd need a different enumerator
for that. And the case where you want to alter the iteration based on
actions during the iteration is

a) ambiguous and generally domain-specific, so not suitable for the CLR; and
b) probably a really bad idea.

There's a deeper issue too, which I won't really get into. But IMHO
there's a tendency for .Net developers to overuse dumb collections, and
put a lot of logic into various enumerations in controller classes that
should really be handled by the collection class itself. It's hard to
avoid doing that, but I think it's a bad habit and I'm not sure I want
to see more language features that encourage it.

Jul 21 '05 #46
> For Each item As Object In New ArrayList(ListBox1.SelectedItems)
ListBox1.Items.Remove(item)
Next
Maybe if you spent 10 seconds testing your code BEFORE you posted it, you
would find out what all this discussion is about!
Well, the case where you want to iterate over the original items while
mutating the collection is always trivial, just copy the references and
enumerate the copy.
for a tutorial on how references work have a look at this most excellent
article:
http://www.dotnet247.com/247referenc...box.com/~skeet

There's a deeper issue too, which I won't really get into. But IMHO
there's a tendency for .Net developers to overuse dumb collections, and
put a lot of logic into various enumerations in controller classes that
should really be handled by the collection class itself. It's hard to
avoid doing that, but I think it's a bad habit and I'm not sure I want
to see more language features that encourage it.
I have no clue as to what you are trying to say. This apparently has no
bearing on the previous threads.
Or maybe you may want to try again AFTER reading the relevant threads.

--
Regards,
Alvin Bruney
[ASP.NET MVP http://mvp.support.microsoft.com/default.aspx]
Got tidbits? Get it here... http://tinyurl.com/27cok
"David" <df*****@woofix.local.dom> wrote in message
news:slrncho6th.knv.df*****@woofix.local.dom... On 2004-08-12, Alvin Bruney [MVP] <> wrote:
David Wrote
Also, it eliminates a real ambiguity to the For Each statement, does
foreach iterate over the original collection, or over the entire
collection as it changes over time?


I don't disagree with that; very good point indeed. But the current
approach makes it impossible to perform simple tasks inherent in UI
programming (like removing multiselects in a listbox, for instance).


Why is that difficult?

For Each item As Object In New ArrayList(ListBox1.SelectedItems)
ListBox1.Items.Remove(item)
Next
Where
such simple tasks are overly complicated, I believe the design should be
reviewed.


Well, the case where you want to iterate over the original items while
mutating the collection is always trivial, just copy the references and
enumerate the copy. I don't see why we'd need a different enumerator
for that. And the case where you want to alter the iteration based on
actions during the iteration is

a) ambiguous and generally domain-specific, so not suitable for the CLR;
and
b) probably a really bad idea.

There's a deeper issue too, which I won't really get into. But IMHO
there's a tendency for .Net developers to overuse dumb collections, and
put a lot of logic into various enumerations in controller classes that
should really be handled by the collection class itself. It's hard to
avoid doing that, but I think it's a bad habit and I'm not sure I want
to see more language features that encourage it.

Jul 21 '05 #47
Hello David,

Your response certainly has a lot of emotion.

too bad it isn't coherent.

(that's my long-winded way of saying "what the heck are you talking about?")
Well, the case where you want to iterate over the original items while
mutating the collection is always trivial, just copy the references and
enumerate the copy.
And this is efficient how? If you want to do that, then do that... but
don't make my code pay for the overhead of that functionality because you
may want to do that once out of ten-thousand calls.

I don't see why we'd need a different enumerator for that.
You just described a different enumerator

There's a deeper issue too, which I won't really get into.
I was starting to hope, but then...
But IMHO
Oh darn, you went into it...
there's a tendency for .Net
developers to overuse dumb collections, and
put a lot of logic into various enumerations in controller classes that
should really be handled by the collection class itself.
It's hard to avoid doing that, but I think it's a bad habit and I'm not sure I want to see more language features that encourage it.


So collections are a bad idea and we should all create "smart" classes that
wrap our types with logic, like how to do a sorted list, or how to do a
stack... (ignoring the debugged code for this that is in the CLR... that's
right, if you didn't write it, it isn't any good... Sorry... I forgot).

I promise not to get you started on generics.
Jul 21 '05 #48
"Cor Ligthert" <no**********@planet.nl> wrote in message news:<Ow**************@TK2MSFTNGP12.phx.gbl>...
From this document

http://msdn.microsoft.com/library/de...tchPerfOpt.asp

The performance difference between For and For Each loops does not appear to
be significant.

I hope this helps?

Cor


It is not significant if you are working with array-based collections
(almost all .NET collections are array based), because an enumerator
over an array collection is pretty much just an index on a particular
position.

If you are working with linked-list-based collections, enumerators
(O(1) per step) are vastly superior to index-based access (O(n) per
access); over large collections this is easily visible.
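That cost difference is easy to see with a hop counter. The sketch below is a minimal singly linked list in Python (not a real .NET collection; the class is invented for illustration) that counts node-to-node traversals for both access styles:

```python
class Node:
    def __init__(self, value, next_node=None):
        self.value = value
        self.next_node = next_node

class LinkedList:
    """Minimal singly linked list with a hop counter (illustration only)."""
    def __init__(self, values):
        self.head = None
        self.hops = 0                      # node-to-node traversals
        for v in reversed(list(values)):
            self.head = Node(v, self.head)

    def get(self, index):
        # Index access restarts from the head on every call: O(n) each.
        node = self.head
        for _ in range(index):
            node = node.next_node
            self.hops += 1
        return node.value

    def __iter__(self):
        # An enumerator keeps its position between steps: O(1) per step.
        node = self.head
        while node:
            yield node.value
            self.hops += 1
            node = node.next_node

n = 100
ll = LinkedList(range(n))
indexed = [ll.get(i) for i in range(n)]    # 0 + 1 + ... + 99 = 4950 hops
hops_indexed = ll.hops
ll.hops = 0
iterated = list(ll)                        # exactly n = 100 hops
hops_iterated = ll.hops
```

Both traversals yield the same values, but the indexed walk does roughly n²/2 hops against the enumerator's n, which is exactly why For Each wins on list-based collections.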

jliu - www.ssw.com.au - johnliu.net
Jul 21 '05 #49
On 2004-08-13, Nick Malik <ni*******@hotmail.nospam.com> wrote:
Hello David,

Your response certainly has a lot of emotion.
It really doesn't.
too bad it isn't coherent.

(that's my long-winded way of saying "what the heck are you talking about?")
Well, the case where you want to iterate over the original items while
mutating the collection is always trivial, just copy the references and
enumerate the copy.


And this is efficient how? If you want to do that, then do that... but
don't make my code pay for the overhead of that functionality because you
may want to do that once out of ten-thousand calls.


It's not particularly efficient, but you only have two choices if you
want to mutate the collection: either copy the references or keep track
of the changes to the collection (with an event or something). Neither
is as efficient as keeping the collection read-only, and which is more
efficient depends entirely on what you're doing during the enumeration.

And if you want to enumerate over only the original items in the
collection, copying the references is your only reasonable choice. From
the point of view of efficiency, it doesn't matter whether you do this
explicitly or if .Net does it implicitly behind the scenes with a new
enumerator type.

For the example given, removing selected items from a ListBox, copying
the references is going to take a trivial amount of time compared to the
time it takes to redraw the ListBox.
I don't see why we'd need a different enumerator for that.


You just described a different enumerator


And I didn't need a new enumerator type or new language construct to
achieve the effect.
There's a deeper issue too, which I won't really get into.


I was starting to hope, but then...


Heh, somehow I doubt you were. This last part was putting a large
design issue into a very short paragraph, so it's understandable that
it's been misunderstood.
But IMHO


Oh darn, you went into it...
there's a tendency for .Net
developers to overuse dumb collections, and
put a lot of logic into various enumerations in controller classes that
should really be handled by the collection class itself.
It's hard to avoid doing that, but I think it's a bad habit and I'm not

sure I want
to see more language features that encourage it.


So collections are a bad idea and we should all create "smart" classes that
wrap our types with logic, like how to do a sorted list, or how to do a
stack... (ignoring the debugged code for this that is in the CLR... that's
right, if you didn't write it, it isn't any good... Sorry... I forgot).


Well, now who's being emotional?

Obviously we should be using the collections, but IMO they should be
used as base classes or through composition much more often than they
are now. For example, I tend to use typed collections much more often
than the generic ArrayList, etc., and .Net gives me a rich set of tools
to create them with. In my experience, I'm not unusual at all in doing
this.

But once you begin to think of collections as being not just a dumb set
of objects, but a class representing a group of specific types, then
the next step is to start treating it like a full-fledged class in its
own right. And in classic OO terms, we shouldn't be iterating over the
privates of another class to get something done, we should be sending a
message to the class to ask it to perform the action.

Alvin's example is right on target. The only unique property a
ListBoxItemCollection holds onto is whether an item is selected,
removing those items should be a public method of the collection class
(or possibly of the ListBox itself).
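That "send the collection a message" idea can be sketched briefly. The class and method names below are invented for illustration in Python (the real fix would be a method on the .NET collection), but the shape is the point: the caller never enumerates the internals, so the mutate-while-enumerating hazard stays hidden inside the class.

```python
class SelectableItemCollection:
    """A typed collection that owns the 'selected' state, so callers
    ask it to act instead of iterating over its privates."""
    def __init__(self):
        self._items = []                     # list of (value, selected)

    def add(self, value, selected=False):
        self._items.append((value, selected))

    def remove_selected(self):
        # The enumeration and mutation are encapsulated here; callers
        # just get back what was removed.
        removed = [v for v, sel in self._items if sel]
        self._items = [(v, sel) for v, sel in self._items if not sel]
        return removed

    def values(self):
        return [v for v, _ in self._items]

items = SelectableItemCollection()
items.add("a")
items.add("b", selected=True)
items.add("c", selected=True)
items.add("d")
```

A caller then writes `items.remove_selected()` instead of the external For Each loop, which is the design review Alvin was asking for.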

Jul 21 '05 #50

This thread has been closed and replies have been disabled. Please start a new discussion.
