I apologize in advance for the code dump. I trimmed it down as much as I could without losing the context of my question (in bold below).

I have a struct

use std::rand;
use std::rand::Rng;
use std::rand::distributions::{Weighted, WeightedChoice, Sample, IndependentSample};

struct MarkovChain {
    state: uint,
    weights: Vec<Vec<uint>>,
}

modeling a Markov chain. There are some checks on the dimensions of the `weights` matrix that I implement in `MarkovChain::new`:

impl MarkovChain {
    fn new(weights: Vec<Vec<uint>>, initial_state: uint) -> MarkovChain {
        let states = weights.len();
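        // Sanity checks: at least one state, a valid initial state, and a square weight matrix.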
        assert!(states > 0);
        assert!(initial_state < states);
        assert!(weights.iter().all(|row| row.len() == states));
        MarkovChain {
            state: initial_state,
            weights: weights,
        }
    }
}

Now I implement `Sample`:

impl Sample<uint> for MarkovChain {
    fn sample<R: Rng>(&mut self, rng: &mut R) -> uint {
        // I'd like to put the following part in MarkovChain::new
        // instead, but I can't figure out how to store the
        // WeightedChoice inside the MarkovChain struct.
        //BEGIN
        let mut row = self.weights[self.state]
            .iter()
            .enumerate()
            .map(|(i, &wt)| Weighted { item: i, weight: wt })
            .collect::<Vec<Weighted<uint>>>();
        let wc = WeightedChoice::new(row.as_mut_slice());
        //END

        self.state = wc.ind_sample(rng);
        self.state
    }
}

The problem here is that `row` and `wc` have to be constructed every time `sample` is called, and the typical use case involves calling `sample` many, many times.

I'd like to move the calculation of `row` and `wc` (for each state) into `MarkovChain::new` instead, but **I can't seem to figure out how to store a `WeightedChoice` in the `MarkovChain` struct**. How do I do that?
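
In case it clarifies what I'm after, here is roughly the kind of struct I have in mind (the `rows` and `choices` fields are just a sketch, not working code; the sticking point is that each `WeightedChoice` would need to borrow a row owned by the same struct):

struct MarkovChain<'a> {
    state: uint,
    // One Vec<Weighted<uint>> per state, built once at construction time.
    rows: Vec<Vec<Weighted<uint>>>,
    // One WeightedChoice per state, each wanting to borrow the matching
    // row above, i.e. a borrow into the same struct that owns `rows`.
    choices: Vec<WeightedChoice<'a, uint>>,
}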

I can't tell if this is actually difficult, or if I'm suffering brain damage from too many years of garbage-collected languages.

Here's an example usage of the Markov chain. I want to keep the interface unchanged, if possible:

fn main() {
    // Create the 3-state Markov chain illustrated at
    // https://en.wikipedia.org/w/index.php?title=Markov_chain&oldid=626307401#Example
    let mut mc = MarkovChain::new(vec![vec![900,  75,  25],
                                       vec![150, 800,  50],
                                       vec![250, 250, 500]], 0);

    // Expect around 62.5% 0s, 31.25% 1s, and 6.25% 2s after many iterations.
    let rng = &mut rand::task_rng();
    let mut stats = vec![0u, 0, 0];
    for _ in range(0u, 10000) {
        *stats.get_mut(mc.sample(rng)) += 1;
    }
    println!("Expect approximately [6250, 3125, 625]:");
    println!("{}", stats);
}

Here's the whole thing in the Rust playpen.

Snowball

  • For anyone experiencing déjà vu, this question is distinct from my [previous question](https://stackoverflow.com/questions/26230437/how-do-i-return-a-weightedchoice-from-a-function), which was about *returning* a WeightedChoice. – Snowball Oct 08 '14 at 21:28
  • You can pretty easily compute your `row`s in `new()`, but I don't see how you can do the same for the `WeightedChoice`s. The issue is that you have no way to prove to the compiler that the reference it borrows is safely owned by your `MarkovChain` struct... – Levans Oct 08 '14 at 21:53
  • Unfortunately, even moving the just the `row`s into `new()` is tricky: `WeightedChoice::new(&'a mut [Weighted])` modifies its argument, so creating multiple `wc`s for the same `row` skews the probability distribution. – Snowball Oct 08 '14 at 22:02
  • See this related question: [How to initialize struct fields which reference each other](http://stackoverflow.com/q/25269597/234590) – Francis Gagné Oct 09 '14 at 01:01

0 Answers