
Possible Duplicate:
Difference between format specifiers %i and %d in printf

I just checked the reference, and it says both of them indicate a signed integer. I thought there must be some difference.

  • Probably a bit of history there; I wouldn't know. – chris Jul 22 '12 at 03:54
    there's the SO duplicate mentioned by @paulsm4. as quoted from the C99 standard document, section 7.19.6.1: "8. The conversion specifiers and their meanings are: d,i The int argument is converted to signed decimal in the style [−]dddd. The precision specifies the minimum number of digits to appear; if the value being converted can be represented in fewer digits, it is expanded with leading zeros. The default precision is 1. The result of converting a zero value with a precision of zero is no characters." in other words, they are treated the same. – john.k.doe Jul 22 '12 at 04:02
  • @paulsm4 thanks for pointing out the related question, it helps – mko Jul 22 '12 at 04:05
    @yozloy - my pleasure. My comment seemed to get deleted, so let me repeat it here: `"%i" and "%d" are identical for printf but different for scanf`: [difference-between-format-specifiers-i-and-d-in-printf](http://stackoverflow.com/questions/1893490/difference-between-format-specifiers-i-and-d-in-printf) – paulsm4 Jul 22 '12 at 05:07
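To illustrate the scanf difference mentioned in paulsm4's comment, here is a minimal sketch (the input string `"010"` is just an illustrative value): `%d` always parses base 10, while `%i` detects the base from a `0x` or leading-`0` prefix.

```c
#include <stdio.h>

int main(void)
{
    int a = 0, b = 0;

    /* "%d" always parses base 10; "%i" detects the base like strtol(..., 0):
       a leading "0x" means hexadecimal and a leading "0" means octal. */
    sscanf("010", "%d", &a);  /* a == 10          */
    sscanf("010", "%i", &b);  /* b == 8 (octal)   */

    printf("%d %d\n", a, b);  /* prints "10 8"    */
    return 0;
}
```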

1 Answer


There is no difference for printf: both specify the same conversion.

From the C99 standard document, section 7.19.6.1:

d, i

The int argument is converted to signed decimal in the style [−]dddd. The precision specifies the minimum number of digits to appear; if the value being converted can be represented in fewer digits, it is expanded with leading zeros. The default precision is 1. The result of converting a zero value with a precision of zero is no characters.
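As a quick illustration of the quoted passage, the two specifiers can be used interchangeably with printf; a minimal sketch:

```c
#include <stdio.h>

int main(void)
{
    int n = -42;

    /* Both conversions format the int argument as signed decimal,
       so the two lines below print exactly the same thing: -42.  */
    printf("%d\n", n);
    printf("%i\n", n);
    return 0;
}
```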
