I am experimenting with this open-source LZW compression code from GitHub. The code compresses my binary file correctly, but when I try to decompress it, the decoder calls getchar() and reaches EOF within 20 characters of a 50,000+ character file.
GitHub repository: https://github.com/geoffreylitt/lzw
I looked through the decode function to see what was wrong. Eventually I trimmed the code down to only this:
void decode()
{
    int c = 0;

    while ((c = getchar()) != EOF) {
        fprintf(stderr, "%i\n", c);
    }
}
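For reference, here is the same loop as a self-contained program, so it can be built without the rest of the repo; the byte counter is my addition, just to show how early EOF hits:

#include <stdio.h>

int main(void)
{
    int c;
    long count = 0;

    /* Echo every byte value to stderr and count how many we get before EOF. */
    while ((c = getchar()) != EOF) {
        fprintf(stderr, "%i\n", c);
        count++;
    }
    fprintf(stderr, "read %ld bytes before EOF\n", count);
    return 0;
}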
This is the makefile:
CC=gcc
CFLAGS=-O3 -g3 --std=c99 -Wall
all: encode decode

encode: main.c hasharray.c encode.c decode.c bitio.c stack.c globals.c
	$(CC) $(CFLAGS) -o ../bin/encode $^

decode: encode
	ln -f ../bin/encode ../bin/decode
And this is how I was running the program:

./decode < encoded.bin > decoded.bin
I think it has something to do with how stdin handles the file input, but I don't have much of an understanding of what happens to an input stream redirected from encoded.bin.
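Here is a minimal diagnostic sketch (my own addition, assuming the encoded file is named encoded.bin as above): it counts what getchar() sees on redirected stdin, then compares that against the size of the same file opened explicitly with "rb":

#include <stdio.h>

int main(void)
{
    /* Count the bytes getchar() delivers from redirected stdin. */
    long from_stdin = 0;
    while (getchar() != EOF)
        from_stdin++;

    /* Open the same file in explicit binary mode and get its size. */
    FILE *f = fopen("encoded.bin", "rb");
    if (f == NULL) {
        perror("encoded.bin");
        return 1;
    }
    fseek(f, 0, SEEK_END);
    long on_disk = ftell(f);
    fclose(f);

    fprintf(stderr, "stdin: %ld bytes, file on disk: %ld bytes\n",
            from_stdin, on_disk);
    return 0;
}

Run it as ./diag < encoded.bin. If the two numbers differ, the bytes are being lost somewhere between the file and stdin rather than in the LZW code itself.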
On a near-fresh Ubuntu 16.04 install, it works perfectly with the same code, same files, and same command. I would really like to know why or how the two systems handle this differently, so that if my code hits the same issue later down the road I'll recognize why.
Thanks!