I've been racking my brain on this for a good hour or so.
It's a simple check of whether a string is a palindrome or not.
#include <stdlib.h>
#include <stdio.h>
#include <string.h>

#define LEN 20

int main()
{
    char word[LEN], drow[LEN];

    printf("Insert a word: ");
    fgets(word, LEN + 1, stdin);
    strcpy(drow, word);
    strrev(word);

    if (strcmp(drow, word) == 0) {
        printf("The string is a palindrome\n");
    }
    else {
        printf("The string isn't a palindrome\n");
    }

    return 0;
}
This code doesn't work. Here's the version with some extra debug code added to show what's happening, followed by its output:
    char word[LEN], drow[LEN];

    printf("Insert a string: ");
    fgets(word, LEN + 1, stdin);
    strcpy(drow, word);
    strrev(word);

    puts(word);
    puts(drow);
    printf("strcmp=%d\n", strcmp(drow, word));

    if (strcmp(drow, word) == 0) {
        printf("The string is a palindrome\n");
    }
    else {
        printf("The string isn't a palindrome\n");
    }

    return 0;
Insert a word: tacocat
tacocat //this is the "word" string
tacocat //this is the "drow" string
strcmp=1 //the return of strcmp
The string isn't a palindrome
Process returned 0 (0x0) execution time : 4.589 s
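Both strings print identically, so in case it matters, here's the kind of check I could add right after the strrev(word) call to look for a character I can't see (this isn't in the code above, just an extra debugging idea):

    /* dump each character code of word and drow, including the terminator */
    for (size_t i = 0; i <= strlen(word); i++)
        printf("%zu: word=%d drow=%d\n", i, word[i], drow[i]);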
But here's what happens when the fgets() call gets replaced by gets(word):
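To be explicit, that's the only change; everything else stays the same:

    gets(word);    /* instead of fgets(word, LEN + 1, stdin); */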
Insert a word: tacocat
tacocat
tacocat
strcmp=0
The string is a palindrome
Process returned 0 (0x0) execution time : 3.897 s
I can't understand why gets() works but fgets() doesn't.
I've tried replacing the max length of the string with sizeof, but it still doesn't work.
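For reference, this is roughly what I mean by the sizeof attempt (I'm not sure it's the right way to use it):

    fgets(word, sizeof word, stdin);    /* instead of fgets(word, LEN + 1, stdin); */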