Part of a program I'm writing involves getting a list of integers (e.g. 15, 18, 25) and converting each one to binary. I'm iterating through the list and using the following line of code to convert each one:
std::string binary = std::bitset<8>(v).to_string();
(where v is the integer I'm converting)
but the problem with this line of code is that it fixes the length of the output string, so 2 becomes "00000010" and 31 becomes "00011111". Of course, I can't make the width too small or I'll run into trouble with larger numbers, but I want the length of each binary string to match the actual binary representation (2 is "10", 31 is "11111"). I have my reasons for this.
So I tried replacing the <8> with an int that changes based on the number I'm trying to convert:
int length_of_binary;
if (v <= 1) {
    length_of_binary = 1;
}
else if (v <= 3) {
    length_of_binary = 2;
}
else if (v <= 7) {
    length_of_binary = 3;
}
else if (v <= 15) {
    length_of_binary = 4;
}
else if (v <= 31) {
    length_of_binary = 5;
}
else {
    length_of_binary = 6;
}
std::string binary = std::bitset<length_of_binary>(v).to_string();
The problem is that I get the following error when hovering over the (now squiggle-underlined) variable length_of_binary:
"+5 overloads. expression must have a constant value."
and the program won't compile. I even tried tricking the compiler by assigning
the value of length_of_binary to a const int, but it still won't work.
Is there a way to fix this? If not, is there a piece of code/function that will give me what I need?