This is not so much a programming question as a basic arithmetic one. You mention that the while loops confuse you, but you have to look at the code as a whole:
//...
temp = num;
This stores the original input so it can be used later.
while (temp) {
    temp = temp / 10;
    factor = factor * 10;
}
This loop repeatedly divides temp by 10 until the result is 0, at which point the condition evaluates to false and the loop exits. Keep in mind that integer division in C truncates the result, e.g. 2/3 = 0.67, which truncates to 0. Meanwhile, factor is multiplied by 10 once per division, so it effectively records how many divisions were needed to reach 0; for a 5-digit input that is 5 iterations, leaving factor at 100000.
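If it helps to visualize, here is a small standalone sketch of that loop with an added printf to show each step (the hardcoded 12345 and the printf are only for illustration, not part of the original program):

#include <stdio.h>

int main() {
    int temp = 12345; // example input, hardcoded just for this demonstration
    int factor = 1;
    while (temp) {
        temp = temp / 10;       // drop the last digit
        factor = factor * 10;   // record one more power of 10
        printf("temp = %d, factor = %d\n", temp, factor);
    }
    // Prints: temp = 1234, factor = 10 ... down to temp = 0, factor = 100000
    return 0;
}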
while (factor > 1) {
    factor = factor / 10;
    printf("%d ", num / factor);
    num = num % factor;
}
Here the same principle applies. On the first iteration, factor is reduced to 10000 and the number is divided by it: 12345 / 10000 = 1.2345, but since integer division truncates the decimal part you get 1. Next, the modulus operator (%) is applied; it returns the remainder of the division, so 12345 % 10000 = 2345. On the next iteration, 2345 / 1000 = 2.345 = 2, with remainder 345, and so on.
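To make the digit extraction concrete, here is a minimal standalone sketch of that loop which also prints the remainder at each step (again, the hardcoded values and the extra output are just for illustration):

#include <stdio.h>

int main() {
    int num = 12345;
    int factor = 100000;
    while (factor > 1) {
        factor = factor / 10;
        // num / factor is the leading digit, num % factor is what remains
        printf("digit = %d, remainder = %d\n", num / factor, num % factor);
        num = num % factor; // keep only the digits after the one just printed
    }
    return 0;
}

This prints digit = 1, remainder = 2345, then digit = 2, remainder = 345, and so on down to digit = 5, remainder = 0.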
Note that, since you know the number of digits in the original input, you wouldn't even need the first loop; you could simply hardcode the factor:
#include <stdio.h>

int main() {
    int num, factor;

    printf("Enter a 5 digit number: ");
    scanf("%d", &num);

    factor = 100000; // <-- hardcoded for a 5-digit input

    while (factor > 1) {
        factor = factor / 10;
        printf("%d ", num / factor);
        num = num % factor;
    }
    return 0;
}
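For a sample input of 12345, this version prints 1 2 3 4 5, the same as before. The trade-off is that an input with fewer than 5 digits comes out padded with leading zeros (e.g. 42 prints as 0 0 0 4 2), which is exactly what the first loop in the original code avoids by computing factor dynamically.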