Using std::bitset. I'm converting a string to binary using this answer. If I do this, it works:
string myString = "Hi";
for (int i = 0; i < myString.size(); ++i)
{
    cout << bitset<8>(myString.c_str()[i]) << endl;
}
If I do this, it also works:
string myString = "Hi";
for (int i = 0; i < myString.size(); ++i)
{
    bitset<8> foo(myString.c_str()[i]);
    cout << foo << endl;
}
But this doesn't work, and I want to know why:
string myString = "Hi";
bitset<8> foo;
for (int i = 0; i < myString.size(); ++i)
{
    cout << foo(myString.c_str()[i]) << endl;
}
I get: no match for call to ‘(std::bitset<8>) (const char&)’
I think I know how to fix it, but I don't understand why this is happening. Can't you insert into a bitset after declaring it?
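The fix I have in mind is to assign a freshly constructed bitset to foo instead of "calling" it, since the error message suggests foo(...) is being parsed as a call to operator(), which bitset doesn't have. A minimal sketch of that idea (my guess, not a confirmed fix):

string myString = "Hi";
bitset<8> foo;
for (int i = 0; i < myString.size(); ++i)
{
    foo = bitset<8>(myString.c_str()[i]); // overwrite foo by assignment, not a call
    cout << foo << endl;
}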
Let's try one more time. Something like this would work:
for (std::size_t i = 0; i < myString.size(); ++i)
{
    bitset<8> bar(myString.c_str()[i]);
    foo[i] = bar[i];
}
Now this works, but only 8 bits exist in foo (it was declared as bitset<8>, so it can never hold more), while everything is correct in bar. Plus I don't like it; it seems like too much code.
All I want is to declare foo and then insert bits into it in the loop. What am I missing? I don't want to use any third-party library.
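To make the goal concrete, here is the shape of what I'm after, sketched with a vector<bitset<8>> as my guess at a standard-library way to collect one 8-bit pattern per character (the names and approach are just illustrative):

#include <bitset>
#include <iostream>
#include <string>
#include <vector>
using namespace std;

int main()
{
    string myString = "Hi";
    vector<bitset<8>> foo;                     // declared once, filled in the loop
    for (size_t i = 0; i < myString.size(); ++i)
    {
        foo.push_back(bitset<8>(myString[i])); // "insert" one character's bits after declaration
    }
    for (size_t i = 0; i < foo.size(); ++i)
    {
        cout << foo[i] << endl;
    }
    return 0;
}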