
Though this has been discussed before here and here, I can't quite get a simple example to work. Consider the code:

// Create some random type that we want to represent as a Real
struct Foo<Real> {
    x: Real,
    y: Real,
}

// Add the algebra for Foo
impl<Real> std::ops::Add for Foo<Real>
where
    Real: std::ops::Add<Output = Real> + std::ops::Mul<Output = Real>,
{
    type Output = Self;
    fn add(self, other: Self) -> Self::Output {
        Foo {
            x: self.x + other.x,
            y: self.y + other.y,
        }
    }
}

impl<Real> std::ops::Mul for Foo<Real>
where
    Real: std::ops::Add<Output = Real> + std::ops::Mul<Output = Real>,
{
    type Output = Self;
    fn mul(self, other: Self) -> Self::Output {
        Foo {
            x: self.x * other.x,
            y: self.y * other.y,
        }
    }
}

// Compute a function on a slice of Reals
fn foo<Real>(x: &[Real]) -> Real
where
    for<'a> &'a Real: std::ops::Add<Output = Real> + std::ops::Mul<Output = Real>,
{
    &(&x[0] + &x[1]) * &x[2]
}

// Run foo on two different types
fn main() {
    let x = vec![1.2, 2.3, 3.4];
    let _x = foo::<f64>(&x);
    let y: Vec<Foo<f64>> = x.into_iter().map(|z| Foo { x: z, y: z + 1.0 }).collect();
    let _y = foo::<Foo<f64>>(&y);
}

This produces the following compiler errors:

error[E0277]: cannot add `&'a Foo<f64>` to `&'a Foo<f64>`
  --> src/main.rs:47:14
   |
47 |     let _y = foo::<Foo<f64>>(&y);
   |              ^^^^^^^^^^^^^^^ no implementation for `&'a Foo<f64> + &'a Foo<f64>`
   |
   = help: the trait `std::ops::Add` is not implemented for `&'a Foo<f64>`
note: required by `foo`
  --> src/main.rs:35:1
   |
35 | / fn foo<Real>(x: &[Real]) -> Real
36 | | where
37 | |     for<'a> &'a Real: std::ops::Add<Output = Real> + std::ops::Mul<Output = Real>,
38 | | {
39 | |     &(&x[0] + &x[1]) * &x[2]
40 | | }
   | |_^

error[E0277]: cannot multiply `&'a Foo<f64>` to `&'a Foo<f64>`
  --> src/main.rs:47:14
   |
47 |     let _y = foo::<Foo<f64>>(&y);
   |              ^^^^^^^^^^^^^^^ no implementation for `&'a Foo<f64> * &'a Foo<f64>`
   |
   = help: the trait `std::ops::Mul` is not implemented for `&'a Foo<f64>`
note: required by `foo`
  --> src/main.rs:35:1
   |
35 | / fn foo<Real>(x: &[Real]) -> Real
36 | | where
37 | |     for<'a> &'a Real: std::ops::Add<Output = Real> + std::ops::Mul<Output = Real>,
38 | | {
39 | |     &(&x[0] + &x[1]) * &x[2]
40 | | }
   | |_^

This seems to imply that the definitions of Add and Mul are incorrect. Is there an easy way to fix them? In case it makes a difference, I'd like Add and Mul not to take ownership of their arguments. Rather, each operator should allocate a new value and hand ownership of that value back to the caller.
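
To make the desired usage concrete, here is a small, self-contained sketch (mine, not from the linked discussions) showing the call pattern I'm after, using `f64`, where the standard library already implements the operators for references:

fn main() {
    // std implements Add for &f64, so neither operand is consumed:
    let a = 1.5_f64;
    let b = 2.5_f64;
    let c = &a + &b; // a and b remain usable; c is a fresh value owned by the caller
    assert_eq!(c, 4.0);
    // The goal is the same non-consuming behavior for Foo<Real>.
}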


Edit 1

As pointed out in the comments, this is a duplicate of two existing questions combined: implementing `Add` for a reference to a struct, and using a higher-rank trait bound on the generic type.

For posterity's sake, here's working code from @Shepmaster:

use std::ops::{Add, Mul};

// Create some random type that we want to represent as a Real
struct Foo<Real> {
    x: Real,
    y: Real,
}

// Add the algebra for Foo
impl<Real> Add for &'_ Foo<Real>
where
    for<'a> &'a Real: Add<Output = Real> + Mul<Output = Real>,
{
    type Output = Foo<Real>;

    fn add(self, other: Self) -> Self::Output {
        Foo {
            x: &self.x + &other.x,
            y: &self.y + &other.y,
        }
    }
}

impl<Real> Mul for &'_ Foo<Real>
where
    for<'a> &'a Real: Add<Output = Real> + Mul<Output = Real>,
{
    type Output = Foo<Real>;

    fn mul(self, other: Self) -> Self::Output {
        Foo {
            x: &self.x * &other.x,
            y: &self.y * &other.y,
        }
    }
}

// Compute a function on a slice of Reals
fn foo<Real>(x: &[Real]) -> Real
where
    for<'a> &'a Real: std::ops::Add<Output = Real> + std::ops::Mul<Output = Real>,
{
    &(&x[0] + &x[1]) * &x[2]
}

// Run foo on two different types
fn main() {
    let x = vec![1.2, 2.3, 3.4];
    let _x = foo::<f64>(&x);
    let y: Vec<Foo<f64>> = x.into_iter().map(|z| Foo { x: z, y: z + 1.0 }).collect();
    let _y = foo::<Foo<f64>>(&y);
}
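
One note on this design: because the impls are on `&'_ Foo<Real>`, call sites have to borrow explicitly (`&a + &b`). If plain value syntax (`a + b`) is also wanted, a by-value impl can simply forward to the reference impl. A minimal sketch of that pattern (my addition, not part of @Shepmaster's playground code), intended to be appended to the code above:

// Optional by-value impl that forwards to the reference impl above,
// so `a + b` also compiles. Unlike `&a + &b`, this consumes both operands.
impl<Real> Add for Foo<Real>
where
    for<'a> &'a Real: Add<Output = Real> + Mul<Output = Real>,
{
    type Output = Foo<Real>;

    fn add(self, other: Self) -> Self::Output {
        &self + &other
    }
}
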
  • It looks like your question might be answered by the answers of [How do I implement the Add trait for a reference to a struct?](https://stackoverflow.com/q/28005134/155423). If not, please **[edit]** your question to explain the differences. Otherwise, we can mark this question as already answered. – Shepmaster Sep 11 '19 at 17:12
  • [The duplicate applied to your situation](https://play.rust-lang.org/?version=stable&mode=debug&edition=2018&gist=6135557efef81ce47ab2cca60f9133c0). – Shepmaster Sep 11 '19 at 17:15
  • It is particularly worth noting that, if a type implements trait `T`, it does not automatically follow that `T` is also implemented for references to that type. Only in certain cases is that true. The duplicate suggestion by @Shepmaster covers the rest. – Sébastien Renauld Sep 11 '19 at 17:15
  • @Shepmaster Thanks for the fix on the playground. Though this situation is similar to your link, it's slightly different due to the higher rank trait bound on `Real`, which was also throwing me off. Anyway, if you want to post that code as an answer, I'll accept it. If everyone truly thinks this is just a duplicate of multiple questions, we can close things out. – wyer33 Sep 11 '19 at 17:37
  • @wyer33 it's a combination of the question you already found (HRTB for the requirements on `Real`) and implementing the trait for a reference (the Q&A I added in a comment). Since SO allows for a question to be answered by multiple duplicates, that seems like a reasonable solution to me, but I'm always open to well-reasoned discussion to change my mind. – Shepmaster Sep 11 '19 at 17:40
  • We can leave it as a duplicate. I made one final edit with the working code in case anyone runs across the same difficulty. – wyer33 Sep 11 '19 at 17:49
