19

I saw the format specifier %qd while browsing code on GitHub. When I checked it with the GCC compiler, it worked fine:

#include <stdio.h>

int main()
{  
    long long num = 1;
    printf("%qd\n", num);
    return 0;
}

What is the purpose of format specifier %qd in printf()?

UkFLSUI
  • 5,509
  • 6
  • 32
  • 47
msc
  • 33,420
  • 29
  • 119
  • 214
  • 4
    The more portable example would probably be with `int64_t`. But not a bad Q&A – StoryTeller - Unslander Monica Jun 13 '18 at 06:25
  • 8
    The `q` length modifier is a non-portable, non-reliable, antiquated, deprecated way of writing `ll`. Use `ll` in your code; encourage those who maintain the code on GitHub to upgrade to standard C99 formats. – Jonathan Leffler Jun 13 '18 at 06:45
  • `printf` formats are implemented by the runtime library, not by the compiler. This is not a gcc (or egcs, or llvm) issue. On my Ubuntu 20.04 system, for example, `%qd` works with gcc and glibc, but fails with gcc and musl -- same compiler, different runtime libraries. – Keith Thompson Jul 04 '20 at 20:33

3 Answers

17

%qd was intended to handle 64 bits comfortably on all machines, and was originally a bsd-ism (quad_t).

However, egcs (and gcc before that) treats it as equivalent to ll, which is not always equivalent: openbsd-alpha is configured so that long is 64 bits, and hence quad_t is typedef'ed to long. In that particular case, the printf-like attribute doesn't work as intended.

If sizeof(long long) == sizeof(long) on openbsd-alpha, it should work anyway - i.e. %ld, %lld, and %qd should be interchangeable. On OpenBSD/alpha, sizeof(long) == sizeof(long long) == 8.

Several platform-specific length modifiers existed before the ISO C99 extensions came into widespread use; `q` was one of them. For integer types, it causes printf to expect a 64-bit (quad word) integer argument. It is commonly found on BSD platforms.

However, neither C99 nor C11 says anything about the length modifier `q`. The macOS (BSD) manual page for fprintf() marks `q` as deprecated, so using `ll` is recommended instead of `q`.

References:

https://gcc.gnu.org/ml/gcc-bugs/1999-02n/msg00166.html

https://en.wikipedia.org/wiki/Printf_format_string

https://port70.net/~nsz/c/c11/n1570.html#7.21.6.1p7

UkFLSUI
  • 5,509
  • 6
  • 32
  • 47
8

`q` is the quad word length modifier for the printf family of functions, intended to handle 64-bit values comfortably on all machines.

From Wikipedia:

Additionally, several platform-specific length options came to exist prior to widespread use of the ISO C99 extensions:

q - For integer types, causes printf to expect a 64-bit (quad word) integer argument. Commonly found in BSD platforms

Community
  • 1
  • 1
msc
  • 33,420
  • 29
  • 119
  • 214
  • 5
    By the way, for future reference. There is a "answer your own question" checkbox at the bottom of the [ask a question](https://stackoverflow.com/questions/ask) form. It allows you to post them together, and not rush to post them a minute apart. – StoryTeller - Unslander Monica Jun 13 '18 at 06:27
  • 10
    C99 says nothing about length modifier `q`, and neither does C11 (see [§7.21.6.1 The `fprintf` function ¶7](https://port70.net/~nsz/c/c11/n1570.html#7.21.6.1p7)). The macOS (BSD) manual page for `fprintf()` marks `q` as deprecated. The Wikipedia page does not describe it as a C99 extension, but rather as a platform-specific option that existed prior to the C99 extensions being created. – Jonathan Leffler Jun 13 '18 at 06:39
  • 3
    It is important to emphasis that this is a GNU C extension, it has never been part of the C language. I added the gcc tag to the question. – Lundin Jun 13 '18 at 07:38
5

This is one of the more interesting C questions to answer. The format specifier `%qd` stands for "quad word", and it was used to handle 64-bit values with the printf function in the C programming language. Also remember that the 1999 edition of the C standard states that sizeof(long long) >= sizeof(long), from which one can infer that long long has a size of at least 64 bits.

valiano
  • 16,433
  • 7
  • 64
  • 79
Aravinda Meewalaarachchi
  • 2,551
  • 1
  • 27
  • 24
  • 1
    Since the minimum size of `long` is 32-bits, the fact that `sizeof(long long) >= sizeof(long)` only allows you to infer that the size of `long long` must be at least 32 bits. However, [C11 §5.2.4.2.1 Sizes of integer types ``](https://port70.net/~nsz/c/c11/n1570.html#5.2.4.2.1) says: _… Their implementation-defined values shall be equal or greater in magnitude (absolute value) to those shown, with the same sign. … minimum value for an object of type long long int `LLONG_MIN` `-9223372036854775807` // −(2⁶³ − 1)_ which does require `long long` to be a 64-bit type. – Jonathan Leffler Jun 13 '18 at 17:36