
I'd like to perform an exhaustive analysis of correct play for Cribbage, which has two six-card hands and a cut card. https://en.wikipedia.org/wiki/Cribbage

In Cribbage, in an end-game position, suits can become entirely irrelevant.

The key features of cribbage are:

  1. Six cards are dealt to each player, who discards two, giving 6C2 = 6C4 = 15 possible discards per player.
  2. Then a cut card.
  3. There are 4!4! = 576 possible orders (some of which are illegal for certain hands) in which the two players can play the four cards each has retained, in the sub-game called 'pegging'.
  4. The dealer's and non-dealer's decisions are not at all symmetrical: both players' discards form the crib, a second hand scored by the dealer, so the dealer wants to discard good cards there and the non-dealer wants to discard bad cards. In addition, the structure of the pegging game means that different cards are good for the dealer and for the non-dealer.

Hence the problem is, for any given six-card hand for the dealer and the non-dealer respectively, to return the best four-card hand to hold for a given score (play is to 121, so correct play differs at different scores, since it's a turn-based game and the first to 121 takes all). Then, from each four-card hand with three known dead cards (the cut and our two discards), determine the best lead, the best second card (given a particular reply), the best third, fourth, and so on.
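To make the shape of that solver concrete, the two decisions could be expressed roughly as in the outline below. This is only a sketch under the rank-only assumption; none of these names exist in my code yet, they are hypothetical placeholders:

    using System;
    using System.Collections.Generic;

    // Hypothetical outline only: names and signatures are placeholders, not existing code.
    static class CribbageSolver
    {
        // Given a six-card deal (rank-only, 0..12), whether we are the dealer, and both
        // scores, return the four ranks to keep; the other two go to the crib.
        public static byte[] ChooseKeep(byte[] sixCards, bool isDealer, byte ourScore, byte theirScore)
        {
            // evaluate all 6C4 = 15 possible keeps and pick the one with the best expected outcome
            throw new NotImplementedException();
        }

        // Given the four kept ranks, the three known dead cards (cut plus our two discards)
        // and the pegging cards played so far, return the best rank to play next.
        public static byte ChoosePeggingPlay(byte[] kept, byte[] dead, IReadOnlyList<byte> playedSoFar, bool isDealer)
        {
            // search the remaining play orders (at most 4!*4! = 576 per pair of hands) for the best line
            throw new NotImplementedException();
        }
    }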

I am not completely sure how to structure this problem, but with 52C6 * 46C6 * 40 = 7.63 * 10^15 possible starting positions, and then 6C4 * 6C4 * 4!4! = 129,600 possible decision paths for each, this could become too large a problem to solve.
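As a sanity check on those numbers, a few throwaway lines reproduce them (using System.Numerics.BigInteger for the large product):

    using System;
    using System.Numerics;

    class PositionCount
    {
        // n choose k via the multiplicative formula; the division is exact at every step
        static BigInteger Choose(int n, int k)
        {
            BigInteger result = 1;
            for (int i = 1; i <= k; i++)
                result = result * (n - k + i) / i;
            return result;
        }

        static void Main()
        {
            // 52C6 dealer hands * 46C6 non-dealer hands * 40 possible cut cards
            Console.WriteLine(Choose(52, 6) * Choose(46, 6) * 40);   // ~7.63 * 10^15

            // 6C4 keeps for each player, then 4!*4! pegging orders
            Console.WriteLine(15 * 15 * 24 * 24);                    // 129,600
        }
    }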

In order to simplify, I'm considering the case where suits are irrelevant. Then there are no more than 18!/(12!·6!) = 18,564 rank-only hands for the first player, rather than 52!/(6!·46!) = 20,358,520 if suits are relevant, or a smaller number if we try to collapse equivalent hands, similar to _Generating all 5 card poker hands_.
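The 18,564 figure is a multiset count: choosing six ranks from thirteen with repetition gives C(13 + 6 - 1, 6) = C(18, 6). A quick check (the binomial helper is repeated here only so the snippet stands alone):

    using System;

    class RankHandCount
    {
        static long Choose(int n, int k)
        {
            long result = 1;
            for (int i = 1; i <= k; i++)
                result = result * (n - k + i) / i;
            return result;
        }

        static void Main()
        {
            // six ranks from thirteen, with repetition: C(13 + 6 - 1, 6)
            Console.WriteLine(Choose(18, 6));   // 18,564

            // versus six distinct cards from a 52-card deck
            Console.WriteLine(Choose(52, 6));   // 20,358,520
        }
    }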

The algorithm I have come up with is to iterate over the ranks of the cards, with repetition, for a six-card hand, another six-card hand, and then the cut card. This generates unique patterns; however, it doesn't rule out sequences such as 10,10,10,10,10,10,10,10, which are impossible since there are only four cards of each rank in the deck.

This is not too hard to fix: there is 4C4 = 1 way to pick all four cards of a given rank, and 4C3 = 4, 4C2 = 6 and 4C1 = 4 ways to pick three, two or one card of a given rank. The weight of a rank pattern is the product of these factors, so for example the 13 cards 0,1,2,3,4,5,6,7,8,9,10,11,12 (one of each rank) can be formed in 4^13 ways. In any case where a pattern contains more than four of a given rank there are 0 ways to achieve it, and we don't need to evaluate the players' choices for that set of cards.

I have written some code that gets as far as counting the ways to achieve each set, but it's extremely slow, and spends more than 90% of its time calculating the weights.

I tried LINQ GroupBy first, which was terribly slow, so I switched to a Dictionary, and now to an array.

However, something still seems to be wrong, in that it's taking a very long time just to count.

    readonly byte[] ranks = { 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12};

    // main method; the parameters aren't used yet, so pass anything
    bool PlayCribbage(byte poneScore, byte dealerScore)
    {
        Stopwatch sw = new Stopwatch();
        sw.Start();
        var dealerHands = GetCombinations<byte>(ranks, 6);
        long totalCombos = 0;
        long validCombos = 0;
        foreach (var dealerHand in dealerHands)
        {
            var playerHands = GetCombinations<byte>(ranks, 6);
            foreach (var playerHand in playerHands)
            {
                foreach (var cut in ranks)
                {
                    totalCombos++;
                    long weight = this.CountWeight(dealerHand, playerHand, cut);
                    if (weight > 0) validCombos++;

                }
            }
        }
        sw.Stop();
        MessageBox.Show(sw.Elapsed.TotalSeconds.ToString());
        return true;

    }

    // this is where the program is spending all its time
    long CountWeight(IEnumerable<byte> dealer, IEnumerable<byte> player, byte cut)
    {
        long weight = 1;
        byte[] counts = { 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0 };
        // the program spends most of its time iterating over these next two lines of code
        foreach (byte card in dealer) counts[card]++;
        foreach (byte card in player) counts[card]++;
        counts[cut]++;
        foreach (byte count in counts)
        {
            switch (count)
            {
                case 1:
                    weight *= 4;
                    break;
                case 2:
                    weight *= 6;
                    break;
                case 3:
                    weight *= 4;
                    break;
                case 4:
                case 0:
                    break;
                default:
                    return 0;
            }
        }
        return weight;
    }

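    // Generates combinations *with* repetition (non-decreasing sequences): for six cards
    // drawn from 13 ranks that is C(13 + 6 - 1, 6) = C(18, 6) = 18,564 multisets.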
    IEnumerable<IEnumerable<T>> GetCombinations<T>(IEnumerable<T> items, int count)
    {
        int i = 0;
        foreach (var item in items)
        {
            if (count == 1)
                yield return new T[] { item };
            else
            {
                foreach (var result in GetCombinations(items.Skip(i), count - 1))
                    yield return new T[] { item }.Concat(result);
            }
            ++i;
        }
    }

It could be I'm barking up the wrong tree somehow. Or maybe I'm just expecting things to happen quicker than they do on a single-threaded i5-7300U, given 18,564 outer loop items, 18,564 middle loop items, 13 inner loop items, and then the iteration through each IEnumerable.

Is there something wrong with my approach or with using IEnumerable here?

I can continue as is, but the program hasn't done any of the useful work yet and it's already taking an excessive amount of time (an hour!).

Edit: it seems there IS something wrong with IEnumerable, or at least with the way it's resolved in a loop here. I've added .ToArray() calls inside the foreach over dealerHands/playerHands, and this makes it much faster. It seems the enumeration was happening many times, and iterating through an IEnumerable is much slower than iterating over a byte[] or even a List.
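For reference, the change is just to materialize each lazily-built hand once before it is enumerated repeatedly. A sketch of the adjusted loop, assuming the same fields and methods as in the code above plus using System.Linq for ToArray():

    foreach (var dealerHand in GetCombinations<byte>(ranks, 6))
    {
        // materialize once; otherwise the Concat/Skip chain behind the IEnumerable
        // is re-evaluated every time CountWeight enumerates the hand
        var dealerArray = dealerHand.ToArray();
        foreach (var playerHand in GetCombinations<byte>(ranks, 6))
        {
            var playerArray = playerHand.ToArray();
            foreach (var cut in ranks)
            {
                totalCombos++;
                long weight = this.CountWeight(dealerArray, playerArray, cut);
                if (weight > 0) validCombos++;
            }
        }
    }

A further step along the same lines would be to build the full list of 18,564 hands once, e.g. GetCombinations<byte>(ranks, 6).Select(h => h.ToArray()).ToList(), and reuse it for both the dealer and the player loops, so the recursive generator and the array copies only run once.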

  • There is some element of truth in _"maybe I'm just expecting things to happen quicker than they do"_, given that you're expecting it to be running at least 62 billion iterations. – Pranav Hosangadi Aug 03 '20 at 22:25
  • That's true, but there are 12 elements in dealer and player, which are IEnumerable, and it's spending nearly all its time iterating over those (assuming the profiler is not lying to me), whereas it doesn't seem to spend much time in the foreach count in counts loop over an array with 13 elements, which it has to multiply together. Maybe it's something to do with the stage at which the IEnumerable is resolved!? – thelawnet Aug 04 '20 at 07:25

0 Answers