8

My guess is O(n), where n is the number of bits. Or is it constant with respect to n? I mean, shouldn't it just be able to copy the bits from memory?

Foon
  • 221
  • 1
  • 7
  • 6
    I don't think there's any complexity specified, but O(n) sounds reasonable. And I'm a little curious about why you are asking. Is it just plain curiosity, or do you have a more concrete problem? – Some programmer dude Aug 27 '18 at 13:49
  • 2
    But `long` has a given number of bits fixed at compilation time, so what is the difference between O(n) and constant complexity if n is always the same number? – rodrigo Aug 27 '18 at 13:51
  • 2
    I'd assume it is constant as long as the bitset size is less than or equal to the size of `unsigned long long` – NathanOliver Aug 27 '18 at 13:51
  • 5
    It could change depending on the implementation, so knowing about your compiler would help. As an example, [the source of the GNU STL](http://www.aoc.nrao.edu/php/tjuerges/ALMA/STL/html-3.4.6/bitset-source.html) shows that it is O(1). – bracco23 Aug 27 '18 at 13:52
  • @Someprogrammerdude I was curious about its performance compared to making a bit array manually by dividing by 2. – Foon Aug 27 '18 at 13:55
  • I also think O(1), because the internal type is bounded to 64 bits, thus it should be a `uint64_t`. – hellow Aug 27 '18 at 13:55
  • 4
    @Foon: This definitely sounds like a premature optimization. – Jaa-c Aug 27 '18 at 13:56
  • 1
    I'd assume that the actual content is stored as real memory bits (this justifies existence of `reference` member class), so compiler could simply copy contents of `long` to the memory occupied by `std::bitset`. – Yksisarvinen Aug 27 '18 at 13:59
  • @Jaa-c DEFINITELY but I was feeling irritated for not finding a concrete answer. – Foon Aug 27 '18 at 14:05
  • @Yksisarvinen something similar is said here http://www.cplusplus.com/reference/bitset/bitset/bitset/ under “val” – Foon Aug 27 '18 at 14:07
  • cpp.sh/5i3op It seems to be the case for GCC. `sizeof(std::bitset<64>)` is 8, `sizeof(std::bitset<65>)` is 16. I'd expect any sane compiler to do memcpy between int and the bitset (or optimize it and use registers). – Yksisarvinen Aug 27 '18 at 14:10
  • @Foon I wouldn't trust the wording from cplusplus.com. But yes, it seems reasonable. I hope somebody will quote the standard to clarify it in answers. – Yksisarvinen Aug 27 '18 at 14:15

1 Answer

9

Mathematically speaking, `long` has a fixed length, therefore copying its contents is a constant-time operation. On the other hand, you also need to zero out the rest of the bits in the bitset, and that cannot be done in less than linear time with respect to the length of the bitset. So, in theory, you cannot do better than O(n), where n is the length of the bitset.

I guess that from the asymptotic-complexity point of view you can safely assume that the complexity of the constructor is the same as that of zeroing out the allocated memory.

This analysis, however, only has value for huge values of n, and it does not make much sense to me to use the `long` constructor to initialize a bitset of a million bits. So, if the size of the bitset is on the same scale as the size of `long`, construction is practically constant-time.
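The argument above can be sketched with a hypothetical, simplified bitset (not the standard library's actual implementation): the constructor copies the `long` into the first word in O(1), but zero-initializing the remaining words is O(n) in the number of bits.

```cpp
#include <cstddef>
#include <cstdint>
#include <vector>

// Hypothetical sketch of what a bitset-like constructor must do:
// copy the 64 input bits into the first word, then zero the rest.
struct SimpleBitset {
    std::vector<std::uint64_t> words; // one word = 64 bits

    SimpleBitset(std::size_t nbits, unsigned long long val)
        : words((nbits + 63) / 64, 0)  // zero-initialization: O(n) in nbits
    {
        if (!words.empty())
            words[0] = val;            // copying the long itself: O(1)
    }

    bool test(std::size_t pos) const {
        return (words[pos / 64] >> (pos % 64)) & 1u;
    }
};
```

For n on the order of `sizeof(long)`, the vector has a single word and the whole construction is effectively constant-time, matching the practical observation above.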

jlanik
  • 859
  • 5
  • 12