
When I test the types of literals in GHCi, I find

Prelude> :t 1
1 :: Num p => p

Prelude> :t 'c'    
'c' :: Char

Prelude> :t "string"    
"string" :: [Char]

Prelude> :t 1.0
1.0 :: Fractional p => p

The question is: how does Haskell determine the type of such a literal? Where can I find information about that?

Furthermore, is there any way to change how GHC interprets the type of a literal?

For example:

 -- do something

 :t 1
 1 :: Int      -- interprets 1 as Int rather than Num p => p

 :t 1.0
 1.0 :: Double -- interprets 1.0 as Double rather than Fractional p => p

Thanks in advance.

assembly.jc
    https://stackoverflow.com/questions/34974872/what-does-num-a-a-mean-in-haskell-type-system not what you asked but somewhat helpful – blueheart Oct 12 '18 at 03:08
    Note that the types of literals are exactly what you got above: some of these are polymorphic values, which will fit any suitable type. If `1` was an `Int`, then we couldn't do `x+1` when `x::Double`, and we would need to resort to an explicit numeric conversion (there is no "numeric promotion" in Haskell). If you really want an int `1`, use `(1 :: Int)` instead; otherwise, let `1` adapt to the type required by the context. – chi Oct 12 '18 at 08:19

2 Answers


You can ask ghci to default the type variables:

$ ghci
λ> let x = 3
λ> :type x
x :: Num p => p
λ> :type +d x
x :: Integer
λ> :type +d 1
1 :: Integer
λ> :type +d 1.0
1.0 :: Double

The :type +d command makes GHCi choose the default types for the type variables. For reference, these are the general Haskell defaulting rules:

default Num Integer
default Real Integer
default Enum Integer
default Integral Integer
default Fractional Double
default RealFrac Double
default Floating Double
default RealFloat Double

You can learn more about it here.
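If you want to change the fallback itself rather than just ask GHCi to apply it, standard Haskell also lets a module declare its own default list. A minimal sketch of my own (not from the answer above), using only a plain Haskell 2010 `default` declaration:

```haskell
-- Sketch: a module-level 'default' declaration replaces the usual
-- (Integer, Double) fallback for ambiguous numeric literals.
module Main where

default (Int, Double)

main :: IO ()
main = do
  print (1 + 1)    -- constraint is (Num a, Show a); with the declaration
                   -- above the literals default to Int, not Integer
  print (1.5 * 2)  -- needs Fractional, so Int is skipped and Double is used
```

The declaration only affects defaulting inside the module it appears in.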

Sibi

If you write 1, it has any possible number type. That's what Num p => p actually means.

If you use 1 in an expression, GHCi will attempt to figure out the correct type of number to use based on what functions you're calling on it, and then automatically give 1 the right type.

If GHCi cannot guess what the correct type is (because there's not enough context or because several types would fit), it defaults to Integer. (And for 1.0 it will default to Double. And for any other type constraint, it will try to default to () if possible.)

This is similar to how compiled code works. If you write a number in your source code, GHC (the compiler) will attempt to work out what the correct type should be. The difference is that the compiler only applies the standard numeric defaulting rules; when those don't cover the situation, it won't "guess", it'll just give you a compile-time error and demand that you specify what you mean. That's desirable for making compiled code behave the way you expect, but it's tedious when you're interactively trying things out, which is why GHCi defaults more liberally.
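For illustration, here is a small sketch of my own (the `Pretty` class is made up for the example) showing a literal the compiler refuses to default, and how a type annotation fixes it:

```haskell
-- Sketch: defaulting only applies when every class in the constraint is a
-- standard one, so a literal constrained by a user-defined class stays
-- ambiguous until you annotate it.
class Pretty a where
  pretty :: a -> String

instance Pretty Int where
  pretty n = "Int: " ++ show n

-- main = putStrLn (pretty 1)          -- rejected: ambiguous type variable

main :: IO ()
main = putStrLn (pretty (1 :: Int))    -- annotation says exactly what 1 is
```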

The type of a single character is always Char.

The type of a string is always String or [Char]. (One is an alias for the other.)

The type of True and False is always Bool. And so on.

So it's only really numbers that have the possibility of multiple types.

[Well, there's an option to make strings polymorphic too, but we won't worry about that now...]
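(For the curious, that option is the OverloadedStrings extension. A minimal sketch, assuming the `text` package is installed:)

```haskell
{-# LANGUAGE OverloadedStrings #-}
-- With OverloadedStrings, string literals get the polymorphic type
-- 'IsString a => a' and adapt to the context, much like numeric literals.
import Data.Text (Text)

greeting :: Text
greeting = "hello"   -- the literal becomes a Text here, not a [Char]

main :: IO ()
main = print greeting
```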

If you want the messy details, you can read the Haskell Language Report (the official specification document that defines the Haskell language) and the GHCi user manual (which describes what GHCi does).

MathematicalOrchid