
I have come across a borrow checker problem using Rust 2018 that I cannot find the solution to. Basically, I have a function that takes a mutable reference to a vec, and as the first part of its execution passes that same vec into another function as an immutable reference. The latter function returns a new owned value, or at least I intend it to. The problem is that the compiler seems to regard the immutable borrow for the function call as lasting until the end of the outer function.

Unfortunately, this isn't a problem that is solved simply by putting braces around things (it shouldn't be anyway since I'm using Rust 2018). Moreover, while I have found a number of SO questions that appear to touch on similar matters (e.g. this, this, this and this), I haven't been able to find anything else that directly addresses this problem. Or at least, nothing where I have been able to work out what I should do from it. Crucially, most other similar questions either seem to involve a reference as the return type or were only an issue before non-lexical lifetimes.

I have created an executable MVE in the Rust Playground, plus the full program in case it helps. I post the code below for reference:

// This function was blatantly borrowed from a Stack Overflow post
// but unfortunately I lost track of which one.
fn compute_mean_of_vec<'g, T>(input_vec: &'g [T]) -> T
where
    T: Copy
        + num::Zero
        + std::ops::Add<T, Output = T>
        + std::ops::Div<T, Output = T>
        + num::FromPrimitive
        + std::iter::Sum<&'g T>,
{
    let sum: T = input_vec.iter().sum();
    sum / num::FromPrimitive::from_usize(input_vec.len()).unwrap()
}

fn normalise_cost_vec<'a, T>(cost_vec: &'a mut Vec<T>)
where
    T: std::ops::SubAssign
        + Copy
        + num::traits::identities::Zero
        + std::ops::Div<Output = T>
        + num::traits::cast::FromPrimitive
        + std::iter::Sum<&'a T>,
{
    let mean = compute_mean_of_vec(cost_vec);
    for c in cost_vec.iter_mut() {
        *c -= mean;
    }
}

fn main() {
    let mut my_vec = vec![5.0f32; 5];
    normalise_cost_vec(&mut my_vec);
    for e in my_vec.iter() {
        println!("{}", e);
    }
}

The error message the compiler produces is:

error[E0502]: cannot borrow `*cost_vec` as mutable because it is also borrowed as immutable
  --> src/main.rs:26:14
   |
16 | fn normalise_cost_vec<'a, T>(cost_vec: &'a mut Vec<T>)
   |                       -- lifetime `'a` defined here
...
25 |     let mean = compute_mean_of_vec(cost_vec);
   |                -----------------------------
   |                |                   |
   |                |                   immutable borrow occurs here
   |                argument requires that `*cost_vec` is borrowed for `'a`
26 |     for c in cost_vec.iter_mut() {
   |              ^^^^^^^^ mutable borrow occurs here

Looking at the error message, it looks to me like there is probably some issue with the lifetimes specified on the two functions. I have to admit that the ones I included were pretty much just put there according to the suggestions from the compiler and Clippy; I don't fully understand them. As best I can tell, the compiler somehow thinks that the immutable borrow in the call to compute_mean_of_vec should last for the entirety of the remainder of the call to normalise_cost_vec.

What have I done wrong, and how can I make the compiler happy? I guess it has something to do with specifying another lifetime, but I haven't been able to work out the correct approach, despite looking at The Book and a number of online resources.


3 Answers


The problem is the Sum trait. Let's look at its declaration:

pub trait Sum<A = Self> {
    fn sum<I>(iter: I) -> Self
    where
        I: Iterator<Item = A>;
}

This means that the bound `T: Sum<&'a T>` ties the borrow of the input to the lifetime `'a`. In `normalise_cost_vec`, `'a` is the lifetime of the entire mutable reference, so the compiler has to assume that the immutable borrow created for the call could (theoretically) live until the end of the function. That is why you get the "also borrowed as immutable" error.
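A minimal sketch of this coupling, without the `num` crate (the function name `demo` is made up for illustration): because the `Sum<&'a T>` bound names the same `'a` as the `&'a mut` parameter, the immutable borrow inside the function is pinned for the whole of `'a`.

```rust
// Hypothetical minimal example: the Sum bound uses the *same* lifetime
// 'a as the mutable reference, so the shared reborrow for `sum()` is
// forced to last for all of 'a.
fn demo<'a, T: std::iter::Sum<&'a T>>(v: &'a mut Vec<T>) -> T {
    let s: T = v.iter().sum();
    // The immutable borrow above is pinned to 'a, so a mutable use here
    // would fail with the same error the question shows:
    // for c in v.iter_mut() {}  // error[E0502]
    s
}

fn main() {
    let mut v = vec![1.0f32, 2.0, 3.0];
    let s = demo(&mut v);
    println!("{}", s);
}
```

At the call site the borrow still ends after the call (the returned `f32` carries no lifetime); the restriction only bites inside the function body.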

The solution is to drop the Sum bound and use fold instead: you already require a zero value (num::Zero) and the Add trait for your T, which is all fold needs.

fn compute_mean_of_vec<'g, T>(input_vec: &'g [T]) -> T
where
    T: Copy
        + num::Zero
        + std::ops::Add<T, Output = T>
        + std::ops::Div<T, Output = T>
        + num::FromPrimitive,
{
    let sum: T = input_vec.iter().fold(T::zero(), |a, e| a + *e);
    sum / num::FromPrimitive::from_usize(input_vec.len()).unwrap()
}

fn normalise_cost_vec<'a, T>(cost_vec: &'a mut Vec<T>)
where
    T: std::ops::SubAssign
        + Copy
        + num::traits::identities::Zero
        + std::ops::Div<Output = T>
        + num::traits::cast::FromPrimitive,
{
    let mean = compute_mean_of_vec(cost_vec);
    for c in cost_vec.iter_mut() {
        *c -= mean;
    }
}

(Playground)
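As a num-free sketch of why fold sidesteps the issue (specialising to plain `f32` instead of the generic `T`; the function names are made up): `fold` returns an owned value with no lifetime attached, so the immutable borrow ends as soon as the call returns.

```rust
// Sketch without the `num` crate: `fold` produces an owned f32, so the
// shared borrow of `v` ends when `mean_f32` returns.
fn mean_f32(v: &[f32]) -> f32 {
    let sum = v.iter().fold(0.0, |a, e| a + *e);
    sum / v.len() as f32
}

fn normalise_f32(v: &mut Vec<f32>) {
    let mean = mean_f32(v); // immutable borrow ends here
    for c in v.iter_mut() { // so the mutable borrow is now allowed
        *c -= mean;
    }
}

fn main() {
    let mut v = vec![5.0f32; 5];
    normalise_f32(&mut v);
    println!("{:?}", v);
}
```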

    I have selected another answer as 'the solution' because it lets me continue using `sum`, which I prefer, but I _very_ much appreciate your clear explanation of the issue and suggestion. Thank you @hellow! :) – Jarak Aug 05 '19 at 21:05

It seems that the problem was with the Sum trait's lifetime parameter; here is a solution that keeps the trait:

fn compute_mean_of_vec<'g, T>(input_vec: &'g Vec<T>) -> T
where
    for<'x> T: Copy
        + num::Zero
        + std::ops::Add<T, Output = T>
        + std::ops::Div<T, Output = T>
        + num::FromPrimitive
        + std::iter::Sum<&'x T>,
{
    let sum: T = input_vec.iter().sum();
    sum / num::FromPrimitive::from_usize(input_vec.len()).unwrap()
}

fn normalise_cost_vec<'a, T>(cost_vec: &'a mut Vec<T>)
where
    for<'x> T: std::ops::SubAssign
        + Copy
        + num::traits::identities::Zero
        + std::ops::Div<Output = T>
        + num::traits::cast::FromPrimitive
        + std::iter::Sum<&'x T>,
{
    let mean = compute_mean_of_vec(cost_vec);
    for c in cost_vec.iter_mut() {
        *c -= mean;
    }
}

fn main() {
    let mut my_vec = vec![5.0f32; 5];
    normalise_cost_vec(&mut my_vec);
    for e in my_vec.iter() {
        println!("{}", e);
    }
}

That is, by giving the Sum trait its own universally quantified lifetime parameter (a higher-ranked trait bound, `for<'x>`), the borrow is no longer tied to `'g`, so the parameter `'g` won't be assumed to be carried along the whole function.
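A stripped-down sketch of the same higher-ranked bound without the `num` crate (the function name `total` is made up): `for<'x>` says T must implement `Sum<&'x T>` for *every* lifetime, so the compiler is free to pick a short borrow that ends right after the call.

```rust
// Higher-ranked trait bound: T implements Sum for references of any
// lifetime, so the borrow taken for `sum()` can be as short as needed.
fn total<T>(v: &[T]) -> T
where
    T: for<'x> std::iter::Sum<&'x T>,
{
    v.iter().sum()
}

fn main() {
    let mut v = vec![1.0f32, 2.0, 3.0];
    let s = total(&v); // immutable borrow ends here
    v.push(s);         // mutating afterwards is fine
    println!("{:?}", v);
}
```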

  • This is _exactly_ what I thinking along the lines of, and hoping would be possible. Thank you very much, this has solved my problem in precisely the way I was hoping for! :) – Jarak Aug 05 '19 at 21:03

The solution I found is to not use std::iter::Sum and to rewrite the sum call using fold:

fn compute_mean_of_vec<T>(input_vec: &[T]) -> T
where
    T: Copy
        + num::Zero
        + std::ops::Add<T, Output = T>
        + std::ops::Div<T, Output = T>
        + num::FromPrimitive,
{
    let sum: T = input_vec.into_iter().fold(T::zero(), |acc, &item| acc + item);
    sum / num::FromPrimitive::from_usize(input_vec.len()).unwrap()
}

This way the returned mean is an owned value that is not bound to the lifetime of the input slice, and the compiler is happy.
