
This program counts the number of digits of an integer. When the input does not start with zero ('100') it works fine, but when the input starts with zero ('00100') it gives the wrong output.

Example:

1) input: 100
   output: 3 — the expected output.

2) input: 00100
   output: 2 — an unexpected output.

It treats the constant '00100' as the integer 64, and I don't know why.

    #include <stdio.h>
    #include <stdlib.h>

    int Digit_count(int);
    int main(void)
    {
        int a;
        a = Digit_count(00100);
        printf("\nNumber of Digits = %d\n",a);

        return 0;
    }


    /// function to count the number of digits
    int Digit_count(int a)
    {
        int p, digit;
        p = a, digit = 0;

        printf("\n%d\n", p);  // **this line prints 64**

        while (p != 0)
        {
            p = p / 10;
            digit++;
        }

        return digit;
    }

1 Answer


The C programming language defines several numeral bases that programmers can use for integer constants.

Decimal constants are the standard base-10 numbers we're used to, and they're what you expect your program to use, but the C11 standard (section 6.4.4.1, page 62) defines integer constants that begin with 0 as octal constants, which means base 8.

So when you write 100 in your source it is interpreted as decimal 100 (the same value as octal 144), but when you write 00100 it is interpreted as an octal constant, which is decimal 64.

I hope that helps!

Willis Hershey