
I somehow get the following timings for my program. I understand that if there's I/O involved, real time can be larger than the sum of user time and system time, but how do you explain it when user time alone is larger than real time?

real    0m8.512s
user    0m8.737s
sys     0m1.956s
Daniel

2 Answers


The program is probably using multiple cores at some point. User time is summed over all the cores used, so e.g. running two cores at 100% for 1 s accumulates 2 s of user time.
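
A minimal sketch that reproduces the effect, assuming a machine with at least two cores; the script name busy.py, the worker count, and the loop size are arbitrary choices for illustration:

    # busy.py -- burn CPU in two processes at once so that user time
    # (summed across cores) exceeds real (wall-clock) time.
    from multiprocessing import Process

    def burn(iterations):
        # Pure CPU work with no I/O, so nearly all of it counts as user time.
        total = 0
        for i in range(iterations):
            total += i * i

    if __name__ == "__main__":
        workers = [Process(target=burn, args=(20_000_000,)) for _ in range(2)]
        for w in workers:
            w.start()
        for w in workers:
            w.join()

Running it as "time python busy.py" on an otherwise idle multi-core machine should report a user time close to twice the real time, since both processes burn CPU concurrently on separate cores.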

Fred Foo

In your original post, user time alone was not larger than real time; your user and sys time together were larger than real time, but that is possible, as explained in this entry.

vinnief