I don't want to know what the CPU utilization is right now; I want the
average utilization over, say, the last hour. So I came up with a
method: I get a Process object representing the "Idle" process and
figure out what percentage of the time it has been running since my
function was last called. The average CPU utilization is then 100 minus
that average idle percentage. I compute the idle percentage by dividing
the difference in TotalProcessorTime by the difference in wall-clock
time since the last call. Here's the code:
DateTime _oldTime;
TimeSpan _oldCPUUsage;

private void CPUUsage()
{
    // Find out how much the processor has been idle. Usage is 100% - idle%.
    DateTime newTime = DateTime.Now;
    TimeSpan newCPUUsage = Process.GetProcessesByName("Idle")[0].TotalProcessorTime;

    double idlePercent =
        (newCPUUsage.Subtract(_oldCPUUsage).TotalHours /
         newTime.Subtract(_oldTime).TotalHours) * 100;

    Debug.WriteLine((100.0 - idlePercent).ToString());

    _oldCPUUsage = newCPUUsage;
    _oldTime = newTime;
}
It seems to work great some of the time, but not always. Sometimes my
CPU usage (100.0 - idlePercent) comes out negative, or ridiculously
large (> 100000). So there must be something wrong with my algorithm,
but I can't see what. My guess is that the "Idle" process might not be
giving me quite what I think it is. Any ideas?
Or is there a better way to do what I am trying to do?
thanks
John