Consider the following code:
fn main() {
    let s = (&&0,);
    let (x,) = s; // x: &&i32
    let (&y,) = s; // y: &i32
    let (&&z,) = s; // z: i32

    let t = &(&0,);
    let (x,) = t; // x: &&i32
    let (&y,) = t; // y: i32

    let u = &&(0,);
    let (x,) = u; // x: &i32
    let (&y,) = u; // error: mismatched types: expected type `{integer}`, found reference `&_`
}
Could someone explain why the `&` pattern behaves differently in each of these cases? I suppose it is somehow tied to match ergonomics, and maybe some coercions come into play, but I can't wrap my head around it.
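For what it's worth, the types in the comments above are what I see when I double-check them with `std::any::type_name_of_val`. This is just a minimal sketch of how I'm checking (I'm assuming a recent toolchain where that function is stable; the variable names mirror the snippet above):

use std::any::type_name_of_val;

fn main() {
    let s = (&&0,);
    let (x,) = s;
    let (&y,) = s;
    println!("{}", type_name_of_val(&x)); // prints "&&i32"
    println!("{}", type_name_of_val(&y)); // prints "&i32"

    let t = &(&0,);
    let (&y,) = t;
    println!("{}", type_name_of_val(&y)); // prints "i32"
}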