Bytes | Developer Community
Benchmarking C#, what is the most efficient way of running console code?

I'm doing some benchmarking tests to compare Microsoft's CLR against
Mono's CLR, and I could use some suggestions for how to compare the
code objectively. To my surprise, the few tests I've run so far have
had Mono running quite a bit faster than vanilla .NET on my Windows XP
installation, which has me wondering if I'm doing something that's
sandbagging .NET. It's all console code: for Microsoft I'm just
running the .exe file from the command line, and for Mono I'm running
the same .exe through the mono command line. The .exe file was
generated by Visual Studio 2005 in Release mode. Is there anything
further I can do to optimize how the exe runs under .NET?
Thanks for the help.

Mar 10 '07 #1
On Mar 10, 1:01 am, "Wisgary" <wisg...@gmail.com> wrote:
> I'm doing some benchmarking tests to compare Microsoft's CLR against
> Mono's CLR. [...]
Oh yeah, to measure the time between bits of test code I'm using two
DateTime.Now calls, one before the test code and one after, then
subtracting and printing the resulting TimeSpan. If there's a better
way to measure time, suggestions for that would be good too. I'm also
wondering if the version of Microsoft's .NET I'm running matters when
it comes to speed if I don't use any library code; I'm not using any
except for measuring the time it takes for a test to run.

Mar 10 '07 #2
Wisgary wrote:
> Oh yeah, to measure the time between bits of test code I'm using two
> DateTime.Nows, one before the test code, one after and then
> subtracting and printing the resulting TimeSpan. If there's a better
> way to measure time, suggestions for that would be good too.
Stopwatch w = Stopwatch.StartNew();
// code to time
w.Stop();
Console.WriteLine("elapsed time is: {0} s",
    w.ElapsedTicks / (double) Stopwatch.Frequency);

DateTime.Now has terrible resolution for short intervals. If you use
large loops (e.g. ones that take several seconds to run), then it won't
make much difference.
> I'm also wondering if the version of Microsoft's .NET I'm running
> matters when it comes to speed if I don't use any library code. I'm
> not using any except for measuring the time it takes for a test to run.
Sure it matters, why wouldn't it? Use .NET 2.0 for Stopwatch; it isn't
in earlier versions.

-- Barry

--
http://barrkel.blogspot.com/
Mar 10 '07 #3
Thanks for the suggestions. I switched to the Stopwatch and now my
results actually do seem more reasonable; maybe there was a big
difference between the Mono and Microsoft implementations of DateTime
that was lagging everything. Mono is still faster, but not by much, at
least on Windows. The same program that took 23 seconds on Microsoft's
CLR and 21.6 seconds on Mono under Windows took around 19.3 seconds on
Mono under Linux, nearly 4 seconds faster. (The test involved sorting a
really big linked list.) To be fair, these are just pre-tests: when I
ran it on Windows I had a number of applications open, while Linux only
had xterm open, so they aren't really representative of anything yet.
I'll try it out cleanly on Windows later today. Thanks again for the
help, and anyone else who can offer suggestions is welcome to do so!
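The original test code was never posted in the thread, but a
linked-list sorting benchmark along the lines described might look
something like this sketch (the list size, random seed, and
copy-to-array sort strategy are all my own assumptions; the real test
may well have sorted the nodes in place):

```csharp
using System;
using System.Collections.Generic;
using System.Diagnostics;

class LinkedListSortBench
{
    static void Main()
    {
        // Build a large linked list of pseudo-random ints.
        // 1,000,000 is an arbitrary stand-in for "really big".
        Random rng = new Random(42);
        LinkedList<int> list = new LinkedList<int>();
        for (int i = 0; i < 1000000; i++)
            list.AddLast(rng.Next());

        Stopwatch w = Stopwatch.StartNew();

        // Sort by copying to an array, sorting it, and rebuilding
        // the list from the sorted array.
        int[] items = new int[list.Count];
        list.CopyTo(items, 0);
        Array.Sort(items);
        list = new LinkedList<int>(items);

        w.Stop();
        Console.WriteLine("sorted {0} items in {1:F3} s",
            list.Count, w.ElapsedTicks / (double)Stopwatch.Frequency);
    }
}
```

A workload like this stresses allocation and the garbage collector as
much as raw sorting speed, which is worth keeping in mind when
comparing runtimes.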

On Mar 10, 2:31 am, Barry Kelly <barry.j.ke...@gmail.com> wrote:
> DateTime.Now has terrible resolution for short intervals. [...]

Mar 10 '07 #4
You've got to be aware that comparing Mono and the Microsoft .NET CLR
isn't apples to apples.

For example, the Mono garbage collector doesn't do heap compaction (at
least not in the versions that I've used). There are quite a few
examples of things like that that aren't done, so even if Mono appears
faster in some cases, there are a number of cases where it just isn't
yet suitable to use.

In general, for the stuff we've built and run atop Mono, performance
was "good enough". Long-term stability of a process wasn't really
there, though, and the frequent versioning that Mono did really caused
us nightmares. We would have code that worked great in one version and
would completely fail in others.

--
Chris Mullins, MCSD.NET, MCPD:Enterprise, Microsoft C# MVP
http://www.coversant.com/blogs/cmullins

"Wisgary" <wi*****@gmail.com> wrote in message
news:11*********************@p10g2000cwp.googlegroups.com...
> Thanks for the suggestions, I switched to the Stopwatch and now my
> results actually do seem to be more reasonable. [...]


Mar 10 '07 #5
On Mar 10, 4:08 pm, "Chris Mullins [MVP]" <cmull...@yahoo.com> wrote:
> For example, the Mono Garbage Collector doesn't do heap compaction
> (at least not in the versions that I've used). [...]
Yeah, I'm not sure if this is related to the missing heap compaction
you mentioned, but when I scaled up the test and added a few million
extra items to my linked list, .NET worked fine while Mono started
spitting out heap errors and crashing. I had to tone it down a bit so
it would run at all.

Mar 10 '07 #6
Wisgary wrote:
> To my surprise the few tests I've run so far have had Mono running
> quite a bit faster than Vanilla .NET on my Windows XP installation
> [...]
That surprises me.

In all my tests MS .NET has been significantly faster than Mono.

But as usual with benchmarks, the specific code used for the benchmark
can produce almost any result. If you post the code, then you may get
a more specific answer.

Arne
Mar 17 '07 #7
Barry Kelly wrote:
> DateTime.Now has terrible resolution for short intervals. If you use
> large loops (i.e. that take several seconds to run), then it won't
> make much difference.
The test should take many seconds to be good anyway ...

Arne

Mar 17 '07 #8

This discussion thread is closed

Replies have been disabled for this discussion.
