28
#include <stdio.h>

int main()
{
    char *name = "Vikram";
    printf("%s",name);
    name[1]='s';
    printf("%s",name);
    return 0;
}

There is no output printed on the terminal; I just get a segmentation fault. But when I run it in GDB, I get the following:

Program received signal SIGSEGV, Segmentation fault.
0x0000000000400525 in main () at seg2.c:7
7       name[1]='s';
(gdb) 

This means the program receives the segfault on line 7 (obviously I can't write to a constant char array). Then why isn't the printf() on line 6 executed?

Richard J. Ross III
Vikram

4 Answers

56

This is due to stream buffering of stdout. Unless you call fflush(stdout) or print a newline "\n", the output may be buffered.

In this case, it's segfaulting before the buffer is flushed and printed.

You can try this instead:

printf("%s",name);
fflush(stdout);        //  Flush the stream.
name[1]='s';           //  Segfault here (undefined behavior)

or:

printf("%s\n",name);   //  Flush the stream with '\n'
name[1]='s';           //  Segfault here (undefined behavior)
Mysticial
    Note that `fflush` is really the right way to do it - a newline is not guaranteed to trigger a flush (and I've been bitten by that behavior before). – Aaron Dufour Feb 27 '12 at 18:30
  • BTW, `puts(name)` is equivalent to `printf("%s\n",name);`, although some people dislike the implicit newline. Compilers will optimize the printf into puts for you. – Peter Cordes Dec 31 '21 at 07:48
  • Including a newline only flushes implicitly without `fflush` for line-buffered stdout, not full buffered; if you redirect stdout to a file, it will be full-buffered. `stderr` is always unbuffered; that's one reason to favour `fprintf(stderr, ...`) for debug-prints around segfaults, if you don't want to include a newline, or might want to redirect the log to a file, and don't want to also use fflush after every call. – Peter Cordes Dec 31 '21 at 07:53
11

First, you should end your printfs with "\n" (or at least the last one). But that is not related to the segfault.

When the compiler compiles your code, it splits the binary into several sections. Some are read-only, while others are writeable. Writing to a read-only section may cause a segfault. String literals are usually placed in a read-only section (gcc should put it in ".rodata"). The pointer name points into that read-only section. Therefore you should use

const char *name = "Vikram";

(then the compiler rejects the invalid write at compile time). In my response I've used a few "may"s and "should"s. The behaviour depends on your OS, compiler, and compilation settings (the linker script defines the sections).

Adding

-Wa,-ahlms=myfile.lst

to gcc's command line produces a file called myfile.lst with the generated assembler code. At the top you can see

    .section .rodata
.LC0:
    .string "Vikram"

which shows that the string "Vikram" is in .rodata.

The same code using the following (it must be at global scope, otherwise gcc may store it on the stack; notice it is an array and not a pointer)

char name[] = "Vikram";

produces

    .data
    .type name, @object
    .size name, 7
name:
    .string "Vikram"

The syntax is a bit different, but see how it is in the .data section now, which is read-write. By the way, this example works.

paul
    If you notice, the OP is not asking why the segfault occurs, but why the string wasn't printed out in the first place. – Richard J. Ross III Feb 27 '12 at 18:39
    although this may not be exactly answer to the question, the tip and explain on .rodata and .data is helpful. – vts May 08 '14 at 16:16
5

The reason you are getting a segmentation fault is that C string literals are read only according to the C standard, and you are attempting to write 's' over the second element of the literal array "Vikram".

The reason you are getting no output is that your program is buffering its output and crashes before it has a chance to flush its buffer. The purpose of the stdio library, in addition to providing friendly formatting functions like printf(3), is to reduce the overhead of I/O operations by buffering data in in-memory buffers: output is flushed only when necessary, and input is performed occasionally in chunks instead of constantly. Actual input and output will not, in the general case, occur at the moment you call the stdio function, but only when the output buffer is full (or the input buffer is empty).

Things are slightly different if a FILE object has been set so it flushes constantly (like stderr), but in general, that's the gist.

If you're debugging, it is best to fprintf to stderr to assure that your debug printouts will get flushed before a crash.

Perry
1

By default when stdout is connected to a terminal, the stream is line-buffered. In practice, in your example the absence of '\n' (or of an explicit stream flush) is why you don't get the characters printed.

But in theory undefined behavior is not bounded (from the Standard: "behavior [...] for which this International Standard imposes no requirements"), and the segfault can happen even before the undefined behavior occurs, for example before the first printf call!

ouah
  • So... you are saying that the behaviour is so undefined that it can act *backwards in time*? – John Lawrence Aspden Mar 29 '17 at 19:17
  • @JohnLawrenceAspden Probably not in this case but in other programs the compiler might move an assignment to the top of the function or even execute code in parallel. Itanium machine code could do six things in one instruction bundle. Figuring out which one caused a processor trap was interesting. – Zan Lynx Sep 19 '18 at 23:31