I am currently doing some CS research that requires collecting "web page
retrieval latency" statistics. Basically, I have a list of URLs, and I want
to measure the retrieval time for each of them. So far, it seems very simple
to do:
<code>
DateTime beforeCall = DateTime.Now;
myHttpRequest.GetResponse().Close();   // fetch the page and release the connection
DateTime afterCall = DateTime.Now;
TimeSpan latency = afterCall - beforeCall;
</code>
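For context, here is a minimal sketch of the whole measurement loop as I run it (the URL list below is just a placeholder; the real list comes from a file):
<code>
using System;
using System.Net;

class LatencyProbe
{
    static void Main()
    {
        // Placeholder list; the actual URLs are loaded from elsewhere.
        string[] urls = { "http://example.com/", "http://example.org/" };

        foreach (string url in urls)
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);

            DateTime beforeCall = DateTime.Now;
            using (WebResponse response = request.GetResponse())
            {
                // Only the time to receive the response headers is measured;
                // the body is not read in this sketch.
            }
            DateTime afterCall = DateTime.Now;

            Console.WriteLine("{0}: {1} ms", url,
                (afterCall - beforeCall).TotalMilliseconds);
        }
    }
}
</code>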
In the end, the delays are measured in milliseconds, and my code works
(both on .NET and on Mono). What bothers me are the results: with .NET, 80%
of the millisecond measurements end in zero (80 ms, 90 ms, ...). With Mono,
the results are more "randomly" distributed.
My question is: why do I get a rounded result 80% of the time with .NET?
I would understand 100% (always rounded) or 10% (uniform probability), but
this 80% is very weird.
Does anyone have an idea about it?
Thanks,
Joannes