
I am writing a large amount of data to a file: the result of a Huffman encoding, which I need to save and read back later. I came across the idea of using dynamic_bitset for handling the bits. I have tested my code on small data and it works fine, but on bigger data (encoding an image) it fails and crashes. This is what the variable I'm trying to save looks like in the debugger:

_encodedSig {m_bits={ size=46944 } m_num_bits=1502195 } boost::dynamic_bitset<unsigned long,std::allocator<unsigned long> > &

However, when I call size() on it, it returns 16, which is confusing to me. I also tried converting it to unsigned long, but that throws an exception. I wonder how much data dynamic_bitset can handle, and why m_num_bits is not equal to size(). I would appreciate any thoughts and ideas.

  • what language is `_encodedSig {m_bits={ size=46944 } m_num_bits=1502195 }`? `I tried testing to convert it to unsigned long but it throws an exception` - how? Because casting won't throw that. `why the m_num_bits is not equal to size()` - I'd suspect [Undefined Behaviour](http://en.cppreference.com/w/cpp/language/ub) somewhere – sehe Feb 13 '17 at 15:13
  • Maybe you find this interesting: https://stackoverflow.com/questions/31006792/how-to-serialize-boostdynamic-bitset/31015623#31015623 – sehe Feb 13 '17 at 15:14
  • 1
  • @sehe It is the debugger's output, i.e. what the variable looks like during the run. It was undefined behavior, and I just found the problem: I was using size_t to hold the size in bits. I changed it to unsigned long long and it works, but I wonder if it will become a problem when my data gets larger. Thanks for the link; I am actually using the same method to save to file. – user3178756 Feb 13 '17 at 15:53

0 Answers