I somehow get the following timing output from my program. I understand that if there's I/O involved, real time can be larger than the sum of user time and system time, but how do you explain it when user time alone is larger than real time?
    real    0m8.512s
    user    0m8.737s
    sys     0m1.956s
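
For reference, user time is CPU time summed across every core the process's threads ran on, so a multithreaded, CPU-bound program on a multi-core machine can accumulate more user time than wall-clock time. Below is a minimal sketch that reproduces the same pattern; the file name busy.c and the thread count of 4 are my own choices for illustration, not taken from the program above:

    /* busy.c: spin NTHREADS CPU-bound threads for ~2 s of wall time.
     * Build: cc busy.c -o busy -lpthread
     * Run:   time ./busy
     */
    #include <pthread.h>
    #include <stdio.h>
    #include <time.h>

    #define NTHREADS 4

    static void *spin(void *arg)
    {
        (void)arg;
        time_t end = time(NULL) + 2;   /* busy-loop for ~2 wall-clock seconds */
        volatile unsigned long x = 0;  /* volatile keeps the loop from being optimized away */
        while (time(NULL) < end)
            x++;                       /* pure user-space CPU work */
        return NULL;
    }

    int main(void)
    {
        pthread_t t[NTHREADS];
        for (int i = 0; i < NTHREADS; i++)
            pthread_create(&t[i], NULL, spin, NULL);
        for (int i = 0; i < NTHREADS; i++)
            pthread_join(t[i], NULL);
        return 0;
    }

On a machine with at least 4 idle cores, time ./busy should report real of roughly 2 s but user closer to 8 s, the same user > real relationship as in the output above.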