
As per this answer, consider a function that accepts a Vec/slice of String/&str:

fn foo<S: AsRef<str>>(list: &[S]) {
    for s in list.iter() {
        println!("{}", s.as_ref());
    }
}

fn main() {
    foo(&["abc", "def"]);
    foo(&["abc", &format!("def {}", 99)]);
    foo(&[&format!("def {}", 99), "abc"]); // error
}

In the last call, I receive the following error:

error[E0308]: mismatched types
  --> src/main.rs:10:9
   |
10 |     foo(&[&format!("def {}", 99), "abc"]);
   |         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^ expected slice, found array `[&str; 2]`
   |
   = note: expected reference `&[&String]`
              found reference `&[&str; 2]`

In order to have format! as the first element, I can't use references; I have to use owned values instead:

foo(&[format!("def {}", 99), "abc".to_owned()]);

Why?

Shepmaster
rodrigocfd
  • 1
    You can cast the `&String` to a `&str`, e.g. `&[&format!("def {}", 99) as &str, "abc"]` or `&[format!("def {}", 99).as_str(), "abc"])`. – Shepmaster Jun 29 '21 at 18:52
  • You could also force the deref coercion with `&*format!("def {}", 99)`. – Aiden4 Jun 29 '21 at 18:58
  • 3
    I can't find the related duplicate answer now, but the TL;DR is that the compiler infers the type of the array based on the *first* element. The working cases it's `&str`, the failing cases it's `&String`. A `&String` can be coerced to a `&str`, but not the other way around. It also won't "backtrack" to change the `&String` to a `&str` and try everything again. – Shepmaster Jun 29 '21 at 19:11
  • 1
    @Shepmaster That makes perfect sense, and it indeed works. Please make it an answer so I can mark it. – rodrigocfd Jun 29 '21 at 19:13
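Putting the comments together, here is a minimal sketch of the three suggested fixes. Each one makes the first element a `&str` before the array's type is inferred, so the whole array becomes `[&str; 2]` and coerces to `&[&str]`:

```rust
fn foo<S: AsRef<str>>(list: &[S]) {
    for s in list {
        println!("{}", s.as_ref());
    }
}

fn main() {
    // Explicit cast: `&format!(...)` is a `&String`, cast to `&str`.
    foo(&[&format!("def {}", 99) as &str, "abc"]);
    // `String::as_str` borrows the temporary as a `&str` directly.
    foo(&[format!("def {}", 99).as_str(), "abc"]);
    // Forced deref coercion: `*` derefs `String` to `str`, `&` reborrows.
    foo(&[&*format!("def {}", 99), "abc"]);
}
```

In all three, the `format!` temporary lives until the end of the statement, so the borrow inside the array literal is valid for the duration of the call.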
