
I have From implementations to create my type, but something weird happens:

impl From<i64> for Object {
    #[inline]
    fn from(i: i64) -> Self {
        Object {
            i: ((((i << 16) as usize) >> 13) + NAN_VALUE as usize - 1 + classSmallInteger) as i64,
        }
    }
}
impl From<u64> for Object {
    #[inline]
    fn from(u: u64) -> Self {
        Object {
            i: (((u as usize) << 16 >> 13) + NAN_VALUE as usize - 1 + classSmallInteger) as i64,
        }
    }
}
impl From<i32> for Object {
    #[inline]
    fn from(i: i32) -> Self {
        Object {
            i: ((((i << 16) as usize) >> 13) + NAN_VALUE as usize - 1 + classSmallInteger) as i64,
        }
    }
}

I originally had just the first impl, and Object::from(42) worked fine. Then I wanted a conversion from u64, so I added the second definition. After that, Object::from(42_u64) worked, but everywhere I had plain literals like 42 they were now inferred as i32, and I got errors. So I added the third impl; Object::from(42) compiled again, but the literal was still being interpreted as i32, and expressions like 1<<35 gave overflow errors, because the 1 was being interpreted as i32.

I don't understand why adding the second definition suddenly made all my constants default to i32, whereas they had previously been i64.

Is there a place to change the default to be i64?
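In the meantime, the usual workaround is to constrain the literal yourself. A minimal, self-contained sketch (using a hypothetical Wrapper type in place of Object, since NAN_VALUE and classSmallInteger aren't shown here) reproducing the behavior and the suffix fix:

```rust
// Hypothetical Wrapper standing in for Object: two From impls,
// mirroring the situation in the question.
struct Wrapper(i64);

impl From<i64> for Wrapper {
    fn from(i: i64) -> Self { Wrapper(i) }
}
impl From<i32> for Wrapper {
    fn from(i: i32) -> Self { Wrapper(i as i64) }
}

fn main() {
    // With both impls in scope, an unsuffixed literal is ambiguous,
    // so it falls back to i32 and the From<i32> impl is chosen.
    let a = Wrapper::from(42);
    assert_eq!(a.0, 42);

    // Suffixing the literal forces the From<i64> impl and avoids
    // the i32 overflow on large shifts like 1 << 35.
    let b = Wrapper::from(1i64 << 35);
    assert_eq!(b.0, 1_i64 << 35);
    println!("{}", b.0);
}
```

A typed binding (`let n: i64 = 1 << 35;`) works the same way as the suffix; there is no knob to change the global fallback type.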

Dave Mason
  • Rust's behavior is to (a) match the expected type when it's unambiguous, or (b) default to `i32` otherwise. When you added the second `impl`, it introduced ambiguity which resulted in behavior (b). – Lambda Fairy May 27 '21 at 03:16
  • Does this answer your question? [What is the default integer type in Rust?](https://stackoverflow.com/questions/55903243/what-is-the-default-integer-type-in-rust) – Stargateur May 27 '21 at 03:52
  • @LambdaFairy no, ambiguity is a compile error in Rust; it's just the rule that integers default to i32 when unconstrained. A disputable Rust rule, if you ask me. – Stargateur May 27 '21 at 03:54
  • Thanks @LambdaFairy and Stargateur, I now understand that the ambiguity causes a fallback. I think it would be better if the compiler explicitly stated that it was ambiguous. – Dave Mason May 27 '21 at 15:14
  • No worries @DaveMason. Feel free to file an issue, or ping ekuber on Twitter -- unhelpful diagnostics are considered a bug. – Lambda Fairy May 31 '21 at 00:58

1 Answer


As @LambdaFairy and @Stargateur pointed out, adding the second definition makes the inference of the integer constants ambiguous. Previously they were inferred as i64, but once the ambiguity enters the scene, they fall back to i32, as per RFC 212.
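The fallback is easy to observe directly with std::any::type_name; a small sketch (the type_of helper here is just an illustrative wrapper, not part of any API):

```rust
// Helper to display the inferred type of a value.
fn type_of<T>(_: &T) -> &'static str {
    std::any::type_name::<T>()
}

fn main() {
    // A literal with no constraint from its context falls back
    // to i32 (RFC 212), even on a 64-bit target:
    let x = 42;
    assert_eq!(type_of(&x), "i32");

    // Constraining the literal (via suffix or annotated binding)
    // overrides the fallback:
    let y: i64 = 42;
    assert_eq!(type_of(&y), "i64");

    println!("{} {}", type_of(&x), type_of(&y));
}
```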

It would be nice to be told by the compiler that the inference was ambiguous, but at least I understand now.

Dave Mason