
I'm not sure what the technical term is here, so a term I can search for would be appreciated.

Let's say a character has multiple decisions of different weights.

Decision A: 1
Decision B: 3
Decision C: 5
Sum: 9

The code adds the weights together, giving a 1/9 chance of making Decision A, 3/9 of making Decision B, and 5/9 of making Decision C.

There are factors that add certain decisions to the pool and remove others. The weights are not fixed either (e.g. B may be 2 for more intelligent characters, or be split into B1 and B2, each with its own weight).

Right now what I'm doing is just a linear search like the following (in JavaScript):

let totalWeight = 0;
for (let i = array.length - 1; i >= 0; i--) {
    totalWeight += array[i].weight;
}

// this function rolls a random number from 1 to totalWeight
let r = roll(1, totalWeight); 
let search = 1;
for (let i = 0; i < array.length; i++) {
    let w = array[i].weight;
    if (r >= search && r < (search+w)){
        return array[i];
    }
    search += w;
}

But this doesn't seem very efficient. It feels like a binary search should be possible here, but I can't work out how. Any ideas?

LEE Hau Chon
Muz

2 Answers

If the weights change every round and there are neither commonalities nor invariants between rounds, I do not think any algorithm can significantly outperform a linear scan.

Here is a list of algorithms for performing this task.
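For concreteness, the linear scan mentioned above can be written as a single pass without the separate cumulative-range bookkeeping from the question. This is only a sketch; the `{weight}` object shape follows the question, and the injectable `rng` parameter is my addition to keep it testable:

```javascript
// Weighted pick by linear scan: O(n) per call, no precomputation,
// so it tolerates weights that change between every call.
function pickWeighted(items, rng = Math.random) {
    const total = items.reduce((sum, it) => sum + it.weight, 0);
    let r = rng() * total;          // uniform in [0, total)
    for (const it of items) {
        r -= it.weight;
        if (r < 0) return it;       // r fell inside this item's slice
    }
    return items[items.length - 1]; // guard against float rounding
}
```

Subtracting weights as you go avoids maintaining a running lower bound, but it is the same O(n) scan as in the question.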

GZ0

After looking at the code you posted, I think the technique/algorithm you are looking for is rejection sampling. To get the same output as your code using a precomputed sample table:

// Expand each index into the sample pool once per unit of weight.
let sample = [];
for (let i = array.length - 1; i >= 0; i--) {
    for (let j = array[i].weight - 1; j >= 0; j--) {
        sample.push(i);
    }
}
// this function rolls a random number from 0 to sample.length - 1,
// where sample.length equals your total weight
let r = roll(0, sample.length - 1);
return array[sample[r]];

The code above reduces the per-draw time complexity to O(1), but increases the space complexity to O(totalWeight).
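For comparison, the textbook rejection-sampling formulation avoids building the table entirely: pick an index uniformly at random, then accept it with probability `weight / maxWeight`, retrying on rejection. This is a sketch under the question's `{weight}` object shape; the injectable `rng` parameter is my addition:

```javascript
// Classic rejection sampling: no auxiliary table, O(1) extra space.
// Expected iterations per draw: n * maxWeight / totalWeight.
function rejectionSample(items, rng = Math.random) {
    const maxWeight = Math.max(...items.map(it => it.weight));
    for (;;) {
        const i = Math.floor(rng() * items.length);  // uniform index
        // Accept with probability weight / maxWeight; otherwise retry.
        if (rng() * maxWeight < items[i].weight) return items[i];
    }
}
```

This trades the O(totalWeight) table for a randomized running time, which stays cheap as long as the weights are not wildly skewed.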

If you want to implement binary search without the auxiliary table, try the following code:

let totalWeight = 0;
// add one property to each element: its cumulative weight, aw
for (let i = 0; i < array.length; i++) {
    totalWeight += array[i].weight;
    array[i].aw = totalWeight;  // cumulative weights are nondecreasing
}

// this function rolls a random number from 1 to totalWeight
let r = roll(1, totalWeight);

// binary search for the first element whose cumulative weight is >= r
let start = 0;
let end = array.length - 1;
while (start < end) {
    let mid = start + Math.floor((end - start) / 2);
    if (array[mid].aw < r)
        start = mid + 1;
    else
        end = mid;
}
return array[start];

Please note that the binary search is valid because the cumulative weights are nondecreasing by construction; the weights themselves do not need to be sorted. Hope it helps.
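Packaged as reusable functions, the cumulative-weight approach above might look like the following sketch (the function names and the integer `r` parameter, standing in for the question's `roll`, are my own):

```javascript
// Build the cumulative weights once: O(n) setup.
function buildCumulative(items) {
    let total = 0;
    return items.map(it => ({ ...it, aw: (total += it.weight) }));
}

// Binary search for the first element whose cumulative weight is >= r,
// where r is an integer in [1, totalWeight]: O(log n) per draw.
function sampleByCumulative(cum, r) {
    let lo = 0, hi = cum.length - 1;
    while (lo < hi) {
        const mid = (lo + hi) >> 1;
        if (cum[mid].aw < r) lo = mid + 1;
        else hi = mid;
    }
    return cum[lo];
}
```

With weights 1, 3, 5 the cumulative weights are 1, 4, 9, so r = 1 maps to the first item, r = 2..4 to the second, and r = 5..9 to the third.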

LEE Hau Chon
    The main issue of this problem is that the weights change for every function call. Normal sampling algorithms either require O(n) setup time & O(1) / O(log n) sample time or vice versa. If the precomputed arrays cannot be (partially) reused between different calls, I do not think any other sampling algorithm would be much better than a simple linear scan. – GZ0 Jun 24 '19 at 07:40