
I've pored over the chess programming wiki on bitboards, but I'm still struggling with how I'm actually supposed to create them. From what I've gathered they're supposed to be a uint64_t, right? Sometimes I see them represented as long hex-looking numbers, like in Stockfish, and other times as a 64-digit binary number.

How do I take, for example, an array of chessboard positions and convert it into bitboards for each piece, plus two for color?

Max C
  • I'm afraid we have no idea what you're talking about. If you have a problem with some code, present the [MCVE] you've been debugging for the past few days. – Lightness Races in Orbit Jun 26 '17 at 00:50
  • And you *have* read [the bitboard wiki page](https://chessprogramming.wikispaces.com/Bitboards)? Because it answers your question almost immediately. – Some programmer dude Jun 26 '17 at 00:50
  • Yes, I've been reading it for the past 3-4 days. When I declared uint64_t numbers and modified them, my debugger just showed the value in decimal. Is that normal? I'm also confused about whether they should be represented in binary or hex. – Max C Jun 26 '17 at 00:56
  • Binary, octal, decimal and hexadecimal are just ways to *present* integer values; it's all stored in binary in the computer anyway. Your debugger will have a setting to show values as either decimal or hexadecimal (and maybe the others as well), you just have to find it. As for how you should *present* the values, I'd say it's totally up to you. If both binary and hexadecimal are commonly used, pick whichever you fancy (see the short demonstration after this thread). – Some programmer dude Jun 26 '17 at 01:02
  • Thanks man, was finally able to define a bitboard. I just didn't know enough about define/typedef to write the correct code. – Max C Jun 26 '17 at 01:15
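
To make that concrete, here is a minimal sketch printing the same value three ways (constructing a `std::bitset` is just one convenient way to get binary output):

#include <bitset>
#include <cstdint>
#include <iostream>

int main(){
  std::uint64_t rank1 = 0xFFULL; // the eight squares of the first rank

  std::cout << std::dec << rank1 << '\n';      // 255
  std::cout << std::hex << rank1 << '\n';      // ff
  std::cout << std::bitset<64>(rank1) << '\n'; // 64 binary digits: 0...011111111
}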

2 Answers


Since types like unsigned long long are not guaranteed to be exactly 64 bits wide, making use of cstdint is a good idea here, like so:

#include <cstdint>
uint64_t board = 0; // exactly 64 bits: one per square
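
With a plain integer, individual squares are reached by shifting and masking. A minimal sketch, assuming the common square = 8*rank + file mapping with a1 as bit 0 (the helper names are made up for illustration):

#include <cstdint>

inline void setSquare(uint64_t& bb, int sq)   { bb |=  (uint64_t{1} << sq); }
inline void clearSquare(uint64_t& bb, int sq) { bb &= ~(uint64_t{1} << sq); }
inline bool testSquare(uint64_t bb, int sq)   { return (bb >> sq) & 1; }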

However, using std::bitset is likely to generate more readable code with less effort:

#include <bitset>
#include <cassert>

class BitBoard {
 private:
  std::bitset<64> board;
 public:
  auto operator()(int x, int y){
    assert(0<=x && x<=7);
    assert(0<=y && y<=7);
    return board[8*y+x];
  }
  void setAll(bool val){
    if(val)
      board.set();   //All bits on
    else
      board.reset(); //All bits off
  }
};

int main(){
  BitBoard board;
  board.setAll(false); // start from an empty board
  board(4, 1) = true;  // set the square at file 4, rank 1 (e2); bitset's
                       // operator[] returns a proxy, so assignment works
}

Offhand, I am not sure which one would be more performant; others have posted thoughts about performance here.
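
As for converting an array of positions into bitboards: a rough sketch, assuming a 64-entry "mailbox" array of piece letters (uppercase white, lowercase black, '.' for empty, index 0 = a1; the letters and layout are assumptions, not a standard):

#include <array>
#include <cctype>
#include <cstdint>

struct BitBoards {
  std::uint64_t pieces[2][6] = {}; // [color][pieceType], one bitboard each
  std::uint64_t color[2]     = {}; // all white pieces, all black pieces
};

// Map p/n/b/r/q/k (either case) to 0..5; anything else means "empty".
inline int pieceIndex(char c){
  switch (std::tolower(static_cast<unsigned char>(c))) {
    case 'p': return 0; case 'n': return 1; case 'b': return 2;
    case 'r': return 3; case 'q': return 4; case 'k': return 5;
    default:  return -1;
  }
}

BitBoards fromArray(const std::array<char, 64>& board){
  BitBoards bb;
  for (int sq = 0; sq < 64; ++sq) {
    const int p = pieceIndex(board[sq]);
    if (p < 0) continue; // empty square
    const int c = std::islower(static_cast<unsigned char>(board[sq])) ? 1 : 0;
    bb.pieces[c][p] |= std::uint64_t{1} << sq; // set the bit for this square
    bb.color[c]     |= std::uint64_t{1} << sq;
  }
  return bb;
}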

Richard
  • This is what I finally settled on once I got it to work from the chess programming wiki: `typedef unsigned long long U64; #define C64(constantU64) constantU64##ULL` Will that have the same problem you mentioned? – Max C Jun 26 '17 at 01:46
  • @MaxC: maybe. It depends on your environment. It is possible, for instance, for `unsigned long long` to be wider than 64 bits; the standard only guarantees a minimum width. Using `uint64_t` _guarantees_ the number of bits (see the compile-time check after these comments). Using `bitset` will allow you to focus on code development with the option of performing optimizations later. – Richard Jun 26 '17 at 01:49
  • @MaxC: No worries! Since you're a new user, I'll remind you that if you found this answer helpful you can upvote it by hitting the arrow near it, or accept it by hitting the outline of a checkmark. – Richard Jun 26 '17 at 01:58
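
To turn that width guarantee into a compile-time check, one option is a pair of static_asserts (a sketch; the messages are arbitrary):

#include <climits>
#include <cstdint>

// unsigned long long is only required to be *at least* 64 bits wide;
// uint64_t, where the implementation provides it, is exactly 64 bits.
static_assert(sizeof(unsigned long long) * CHAR_BIT >= 64, "unsigned long long too narrow");
static_assert(sizeof(std::uint64_t) * CHAR_BIT == 64, "uint64_t must be exactly 64 bits");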

I was finally able to define a bitboard like so:

typedef unsigned long long  U64; // supported by MSC 13.00+ and C99
#define C64(constantU64) constantU64##ULL    

U64 BBFullBoard  = 0xffffffffffffffffULL; // all 64 bits set: every square occupied
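
For completeness, the C64 macro just pastes the ULL suffix onto a literal, so the constant above could also be written this way (the variable name is illustrative):

U64 BBFullBoard2 = C64(0xffffffffffffffff); // expands to 0xffffffffffffffffULL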
Max C