
I've been using destructuring to create references into nested data, just for practice. I created a method that uses destructuring to break apart a borrowed tuple:

fn print_strings((x, y): &(String, String)) {
    println!("x: {}, y: {}", x, y);
}

fn main() {
    print_strings(&("foo".to_string(), "bar".to_string()));
}

If I add `let () = x;` to the first line of the print_strings() function, the compiler's error tells me that `x` has type `&String`. This led me to believe that this is an instance of deref coercion: that the compiler was translating the `&(String, String)` from main() into a `(&String, &String)` to be used in print_strings().

But then I looked up the documentation for tuples and didn't find an implementation of `Deref`. So my question is: how does the compiler expand this code?
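For reference, a variation on the `let () = x` trick that compiles rather than errors: annotating the bindings explicitly confirms the type the compiler reports (this is my own check, not from the original post):

```rust
fn print_strings((x, y): &(String, String)) {
    // Re-annotating the bindings compiles, confirming each one
    // is a `&String`, not an owned `String`.
    let x: &String = x;
    let y: &String = y;
    println!("x: {}, y: {}", x, y);
}

fn main() {
    print_strings(&("foo".to_string(), "bar".to_string()));
    // prints: x: foo, y: bar
}
```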

My hypothesis is that something like this happens:

fn print_strings(a: &(String, String)) {
    let x = &(a.0);
    let y = &(a.1);
    println!("x: {}, y: {}", x, y);
}
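A more explicit spelling of the same signature, using a reference pattern plus `ref` bindings (the function name `print_strings_explicit` is mine, for illustration), behaves identically and may be closer to what the compiler actually does:

```rust
// The `&` in the pattern dereferences the incoming reference, and
// `ref` binds each field by reference instead of moving it out.
fn print_strings_explicit(&(ref x, ref y): &(String, String)) {
    println!("x: {}, y: {}", x, y);
}

fn main() {
    print_strings_explicit(&("foo".to_string(), "bar".to_string()));
    // prints: x: foo, y: bar
}
```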

EDIT:

I suppose this can't be deref coercion, because the signature of the function already matches the argument being passed in by main().

C-RAD
    Not able to answer right now, but the proper search term is probably "match ergonomics". In this case, `(x, y): &(String, String)` seems to be sugar for `&(ref x, ref y): &(String, String)`. – Cerberus Feb 16 '22 at 19:02

1 Answer


Since you are passing print_strings() a reference to a tuple, it's not possible to move a `String` out of it. So when destructuring an individual element of that tuple in print_strings(), a `&String` is pretty much the only binding that makes sense, and that is what the compiler produces. This destructuring behavior is built into the compiler's pattern-matching rules (the "match ergonomics" mentioned in the comment above) rather than going through any `Deref` implementation.
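A minimal sketch of the point about borrowed tuples (variable names are mine): field access through a shared reference only ever yields a borrow, never an owned `String`:

```rust
fn main() {
    let t = ("foo".to_string(), "bar".to_string());
    let r: &(String, String) = &t;

    // `r.0` is a place behind a shared reference; borrowing it
    // gives `&String`. Writing `let first: String = r.0;` would
    // fail to compile ("cannot move out of `r.0`").
    let first: &String = &r.0;
    println!("{}", first);
}
```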

at54321