I want to programmatically convert a string of characters stored in a file into a string of character codes (encode) by following a code table. The resulting string of binary codes should then be written to a file, from which I can later recover the original string of characters (decode). The codes were generated with the Huffman algorithm, and the code table itself is stored in a file.
For example, given a code table where each character and its corresponding code are separated by a single space, like this:
E 110
H 001
L 11
O 111
encoding "HELLO" should produce "0011101111111".
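To make the intended behaviour concrete, here is a minimal sketch of how I understand the lookup is supposed to work (this is not my actual program, and the file name codefile.txt is just an assumption): read the whole table into a std::map once, then append each character's code.

#include <fstream>
#include <iostream>
#include <map>
#include <string>

int main()
{
    // Load the code table once: character -> code string.
    std::map<char, std::string> codes;
    std::ifstream table("codefile.txt");   // file name assumed for this sketch
    char ch;
    std::string code;
    while (table >> ch >> code)
        codes[ch] = code;

    // Encode by looking each character up in the map.
    std::string text = "HELLO";
    std::string encoded;
    for (char c : text)
        encoded += codes[c];               // e.g. "HELLO" -> "0011101111111"

    std::cout << encoded << '\n';
    return 0;
}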
My C++ code does not seem to produce the complete encoded string. Here is my code:
#include <cstdlib>
#include <fstream>
#include <iostream>
#include <string>
using namespace std;

int main()
{
    // Read the text to be encoded from English.txt
    string English;
    ifstream infile("English.txt");
    if (!infile.is_open())
    {
        cout << "Cannot open file.\n";
        exit(1);
    }
    while (!infile.eof())
    {
        getline(infile, English);
    }
    infile.close();

    cout << endl;
    cout << "This is the text in the file:" << endl << endl;
    cout << English << endl << endl;

    // Open the code table and the output file for the encoded string
    ofstream codefile("codefile.txt");
    ofstream outfile("compressed.txt");
    ifstream codefile_input("codefile.txt");

    // For each character of the text, scan the code table for its code
    char ch;
    string st;
    for (int i = 0; i < English.length();)
    {
        while (!codefile_input.eof())
        {
            codefile_input >> ch >> st;
            if (English[i] == ch)
            {
                outfile << st;
                cout << st;
                i++;
            }
        }
    }
    return 0;
}
For the input string "The_Quick_brown_fox_jumps_over_the_lazy_dog", the output string is only "011100110", but it should be far longer than that!
Please help! Is there anything I have missed? (N.B. my C++ code has no syntax errors.)
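For completeness, this is the kind of decode step I have in mind once the encoding works. It is only a sketch: it assumes the whole bit string fits in memory, that the codes form a prefix-free set (as a real Huffman table does), and the file names are placeholders.

#include <fstream>
#include <iostream>
#include <map>
#include <string>

int main()
{
    // Build the reverse map: code string -> character.
    std::map<std::string, char> decode_table;
    std::ifstream table("codefile.txt");   // file name assumed
    char ch;
    std::string code;
    while (table >> ch >> code)
        decode_table[code] = ch;

    // Read the encoded bit string and decode it code by code.
    std::ifstream in("compressed.txt");    // file name assumed
    std::string bits;
    in >> bits;

    std::string current, decoded;
    for (char bit : bits)
    {
        current += bit;
        auto it = decode_table.find(current);
        if (it != decode_table.end())      // a complete code has been matched
        {
            decoded += it->second;
            current.clear();
        }
    }
    std::cout << decoded << '\n';
    return 0;
}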