I ran the following Java code on a Linux server:
```java
while (true) {
    int a = 1 + 2;
}
```
It caused one CPU core to reach 100% usage. This confuses me, because I learnt that CPUs handle tasks by time slicing: the scheduler runs one task per time slice. If there are 10 time slices, the while-true task should use at most 10% of the CPU, because the other 90% of the slices would be assigned to other tasks. So why is it 100%?
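
For completeness, here is the loop wrapped in a minimal compilable program; the `BusyLoop` class name and the `main` wrapper are just my scaffolding around the snippet above:

```java
// Minimal complete reproduction: the loop body is trivial, but it never
// blocks, sleeps, or waits on I/O, so the thread is always runnable.
public class BusyLoop {
    public static void main(String[] args) {
        while (true) {
            int a = 1 + 2; // the JIT may even eliminate this dead store, leaving a pure spin
        }
    }
}
```

Running this and watching per-core usage (for example with `top`, pressing `1` to show individual cores) reproduces the behaviour: one core sits at 100%.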