Though this has been discussed before here and here, I can't quite make a simple example work. In the following code:
// Create some random type that we want to represent as a Real
struct Foo<Real> {
    x: Real,
    y: Real,
}

// Add the algebra for Foo
impl<Real> std::ops::Add for Foo<Real>
where
    Real: std::ops::Add<Output = Real> + std::ops::Mul<Output = Real>,
{
    type Output = Self;

    fn add(self, other: Self) -> Self::Output {
        Foo {
            x: self.x + other.x,
            y: self.y + other.y,
        }
    }
}

impl<Real> std::ops::Mul for Foo<Real>
where
    Real: std::ops::Add<Output = Real> + std::ops::Mul<Output = Real>,
{
    type Output = Self;

    fn mul(self, other: Self) -> Self::Output {
        Foo {
            x: self.x * other.x,
            y: self.y * other.y,
        }
    }
}

// Compute a function on a slice of Reals
fn foo<Real>(x: &[Real]) -> Real
where
    for<'a> &'a Real: std::ops::Add<Output = Real> + std::ops::Mul<Output = Real>,
{
    &(&x[0] + &x[1]) * &x[2]
}

// Run foo on two different types
fn main() {
    let x = vec![1.2, 2.3, 3.4];
    let _x = foo::<f64>(&x);
    let y: Vec<Foo<f64>> = x.into_iter().map(|z| Foo { x: z, y: z + 1.0 }).collect();
    let _y = foo::<Foo<f64>>(&y);
}
We get the compiler error:
error[E0277]: cannot add `&'a Foo<f64>` to `&'a Foo<f64>`
  --> src/main.rs:47:14
   |
47 |     let _y = foo::<Foo<f64>>(&y);
   |              ^^^^^^^^^^^^^^^ no implementation for `&'a Foo<f64> + &'a Foo<f64>`
   |
   = help: the trait `std::ops::Add` is not implemented for `&'a Foo<f64>`
note: required by `foo`
  --> src/main.rs:35:1
   |
35 | / fn foo<Real>(x: &[Real]) -> Real
36 | | where
37 | |     for<'a> &'a Real: std::ops::Add<Output = Real> + std::ops::Mul<Output = Real>,
38 | | {
39 | |     &(&x[0] + &x[1]) * &x[2]
40 | | }
   | |_^

error[E0277]: cannot multiply `&'a Foo<f64>` to `&'a Foo<f64>`
  --> src/main.rs:47:14
   |
47 |     let _y = foo::<Foo<f64>>(&y);
   |              ^^^^^^^^^^^^^^^ no implementation for `&'a Foo<f64> * &'a Foo<f64>`
   |
   = help: the trait `std::ops::Mul` is not implemented for `&'a Foo<f64>`
note: required by `foo`
  --> src/main.rs:35:1
   |
35 | / fn foo<Real>(x: &[Real]) -> Real
36 | | where
37 | |     for<'a> &'a Real: std::ops::Add<Output = Real> + std::ops::Mul<Output = Real>,
38 | | {
39 | |     &(&x[0] + &x[1]) * &x[2]
40 | | }
   | |_^
This seems to imply that the definitions of Add and Mul are incorrect: they are implemented for Foo<Real> taken by value, while the bound on foo asks for implementations on the reference type &'a Real. Is there an easy way to fix the definitions? In case it makes a difference, I'd like Add and Mul not to take ownership of their arguments. Rather, each operator should allocate new memory for the result and hand ownership of it back to the caller.
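For a concrete type the calling convention I'm after is easy to write, so here is a minimal sketch to make the intent precise (Bar is an illustrative type, not part of the code above); what I can't work out is how to express the same thing generically:

use std::ops::Add;

struct Bar {
    x: f64,
}

// Add is implemented on the reference type, so the operands are only
// borrowed; the freshly allocated Bar is returned by value, handing
// ownership of the result to the caller.
impl Add for &'_ Bar {
    type Output = Bar;

    fn add(self, other: Self) -> Bar {
        Bar { x: self.x + other.x }
    }
}

fn main() {
    let a = Bar { x: 1.0 };
    let b = Bar { x: 2.0 };
    let c = &a + &b;
    // a and b were only borrowed, so all three values are usable here
    println!("{} {} {}", a.x, b.x, c.x);
}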
Edit 1

As pointed out, this is a duplicate of the following two questions combined:

How do I implement the Add trait for a reference to a struct?
How to write a trait bound for adding two references of a generic type?

For posterity's sake, here's working code from @Shepmaster:
use std::ops::{Add, Mul};

// Create some random type that we want to represent as a Real
struct Foo<Real> {
    x: Real,
    y: Real,
}

// Add the algebra for Foo
impl<Real> Add for &'_ Foo<Real>
where
    for<'a> &'a Real: Add<Output = Real> + Mul<Output = Real>,
{
    type Output = Foo<Real>;

    fn add(self, other: Self) -> Self::Output {
        Foo {
            x: &self.x + &other.x,
            y: &self.y + &other.y,
        }
    }
}

impl<Real> Mul for &'_ Foo<Real>
where
    for<'a> &'a Real: Add<Output = Real> + Mul<Output = Real>,
{
    type Output = Foo<Real>;

    fn mul(self, other: Self) -> Self::Output {
        Foo {
            x: &self.x * &other.x,
            y: &self.y * &other.y,
        }
    }
}

// Compute a function on a slice of Reals
fn foo<Real>(x: &[Real]) -> Real
where
    for<'a> &'a Real: std::ops::Add<Output = Real> + std::ops::Mul<Output = Real>,
{
    &(&x[0] + &x[1]) * &x[2]
}

// Run foo on two different types
fn main() {
    let x = vec![1.2, 2.3, 3.4];
    let _x = foo::<f64>(&x);
    let y: Vec<Foo<f64>> = x.into_iter().map(|z| Foo { x: z, y: z + 1.0 }).collect();
    let _y = foo::<Foo<f64>>(&y);
}
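As a quick sanity check (my addition, not part of @Shepmaster's snippet): because the operators now only borrow their operands, both inputs remain usable after each call. These lines can be appended to main:

// Both operands are merely borrowed; c and d are newly allocated values
let a = Foo { x: 1.0, y: 2.0 };
let b = Foo { x: 3.0, y: 4.0 };
let c = &a + &b;
let d = &c * &a;
println!("{} {}", d.x, d.y);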