
When I read Chapter 18.1 of the book, I found that I couldn't understand this piece of code in Listing 18-1:

let age: Result<u8, _> = "34".parse();

It converts "34" into Result::Ok(u8), but if the string literal can't be converted, what error type will the Result hold? What is the purpose of an underscore in the angle brackets of a generic type? How can the compiler determine the type behind the underscore?

Here is my attempt:

fn main() {
    let age: Result<u8, _> = "abc114514".parse();
    println!("{:?}", age);
}

Program output:

Err(ParseIntError { kind: InvalidDigit })
mkrieger1
Tri.w.j
`<_>` is used to tell the compiler to infer the type. It can be used here because `.parse()` into an integer can only fail with a `ParseIntError` – Phydeaux Aug 04 '23 at 11:23

1 Answer


You first need to know what type inference is. Type inference is convenient, but there are limits to what the compiler can work out on its own.
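As a small illustration of type inference in general (my own sketch, not from the book): the compiler can deduce a variable's type from how it is used later, not just from its initializer.

```rust
fn main() {
    // `Vec::new()` alone does not fix the element type...
    let mut v = Vec::new();
    // ...but pushing a u8 lets the compiler infer `Vec<u8>`.
    v.push(42u8);
    println!("{:?}", v); // prints [42]
}
```

If nothing in the program ever constrained `v`, the compiler would reject it with a "type annotations needed" error, which is exactly the limitation mentioned above.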

When you write Result<u8, _>, you are giving the compiler partial information about the type. In many cases that is exactly the minimum the compiler needs in order to infer the full type, without you having to write it out in its entirety.
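Concretely (a sketch of my own), the `_` in the question's snippet stands for `std::num::ParseIntError`, the error type of parsing a `u8` from a string. You could spell the type out in full and the program would behave identically; `_` simply asks the compiler to fill that slot in for you.

```rust
use std::num::ParseIntError;

fn main() {
    // Fully annotated: nothing left for the compiler to infer.
    let explicit: Result<u8, ParseIntError> = "34".parse();
    // Partially annotated: `_` is inferred to be ParseIntError,
    // because that is what `u8`'s FromStr implementation returns.
    let inferred: Result<u8, _> = "34".parse();
    assert_eq!(explicit, inferred); // both are Ok(34)
    println!("{:?}", inferred);    // prints Ok(34)
}
```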

at54321