
There are quite a few questions here on StackOverflow explaining how to calculate process CPU utilization (e.g. this). What I don't understand is how frequency scaling affects CPU utilization calculations. It seems to me that, if I follow the recommended formula (I also checked top's source code, and it does the same), a process running on a CPU at the lowest frequency and a process running at the highest frequency for the same duration will yield identical utilization rates. But this doesn't feel right to me, especially when CPU utilization is used as a stand-in to compare power consumption.
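For concreteness, here is a minimal sketch of the usual formula on Linux: the delta in the process's utime + stime (from /proc/[pid]/stat, the same counters top samples) divided by the elapsed wall-clock time. The PID and interval are just illustrative inputs, not anything from a particular answer.

```python
import os
import time

CLK_TCK = os.sysconf("SC_CLK_TCK")  # clock ticks (jiffies) per second

def proc_cpu_ticks(pid):
    """Return utime + stime (in clock ticks) from /proc/<pid>/stat."""
    with open(f"/proc/{pid}/stat") as f:
        # The comm field may contain spaces, so split after the closing ')'.
        fields = f.read().rpartition(")")[2].split()
    utime, stime = int(fields[11]), int(fields[12])  # stat fields 14 and 15
    return utime + stime

def cpu_utilization(pid, interval=1.0):
    """Fraction of one CPU used by <pid> over the sampling interval."""
    t0, ticks0 = time.monotonic(), proc_cpu_ticks(pid)
    time.sleep(interval)
    t1, ticks1 = time.monotonic(), proc_cpu_ticks(pid)
    return ((ticks1 - ticks0) / CLK_TCK) / (t1 - t0)

# Note: nothing in this formula involves the CPU frequency. A process that
# is runnable for the whole interval reports ~100% whether the core sits at
# its lowest or highest P-state, which is exactly what the question is about.
```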

What am I missing?

Dmitry B.

0 Answers