I'm having some difficulty understanding why the inferred type is different from what I would expect. Here's an example (I tried to make it as short as possible):
import Control.Applicative
import Data.Word
import Text.ParserCombinators.Parsec
import Text.ParserCombinators.Parsec.Token
import Text.Parsec.Language (emptyDef)
import Text.Parsec.Prim
import Data.Functor.Identity
--parseUInt' :: Num b => ParsecT String u Identity b
parseUInt' = fromInteger <$> decimal (makeTokenParser emptyDef)
--parseUInt1 = fromInteger <$> decimal (makeTokenParser emptyDef)
--parseUInt2 = fromInteger <$> decimal (makeTokenParser emptyDef)
parsePairOfInts = do
  x <- parseUInt'
  char ','
  y <- parseUInt'
  return $ (x, y)
parseLine :: String -> Either ParseError (Word32, Word8)
parseLine = parse parsePairOfInts "(error)"
main = print . show $ parseLine "1,2"
This code does NOT compile:
test.hs:21:19:
Couldn't match type ‘Word32’ with ‘Word8’
Expected type: Parsec String () (Word32, Word8)
Actual type: ParsecT String () Identity (Word32, Word32)
In the first argument of ‘parse’, namely ‘parsePairOfInts’
In the expression: parse parsePairOfInts "(error)"
Failed, modules loaded: none.
But if I uncomment the type signature of parseUInt', it compiles just fine.
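In other words, the compiling version differs only in the signature being present (exactly the commented line above restored):

parseUInt' :: Num b => ParsecT String u Identity b
parseUInt' = fromInteger <$> decimal (makeTokenParser emptyDef)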
At the same time, if I query type information in GHCi, it looks like this:
λ>:t (fromInteger <$> decimal (makeTokenParser emptyDef))
(fromInteger <$> decimal (makeTokenParser emptyDef))
:: Num b => ParsecT String u Identity b
But if I do NOT specify the type signature explicitly, the type 'b' somehow gets fixed to Word32.
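I can reproduce the same symptom outside the parser code. Here is a minimal sketch of my own (a reduced example, not part of the original program, assuming default GHC settings): a top-level binding without a signature likewise seems to get pinned to a single concrete type as soon as it is used.

import Data.Word

-- A top-level binding with no type signature; I would expect it to stay
-- polymorphic (Num a => a) ...
n = fromInteger 42

asWord32 :: Word32
asWord32 = n   -- this use alone is fine

asWord8 :: Word8
asWord8 = n    -- ... but adding a second use at a different type gives
               -- the same kind of "Couldn't match type" error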
If I replace parseUInt' with two separate functions parseUInt1 and parseUInt2 (with identical implementations), the code compiles too.
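For reference, that variant is literally the commented parseUInt1/parseUInt2 definitions above uncommented, with one copy used per field (the rest of the file unchanged):

parseUInt1 = fromInteger <$> decimal (makeTokenParser emptyDef)
parseUInt2 = fromInteger <$> decimal (makeTokenParser emptyDef)

parsePairOfInts = do
  x <- parseUInt1
  char ','
  y <- parseUInt2
  return $ (x, y)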
I thought that if I don't specify a function's type, the inferred type would be the most general one (Num b => ...), but somehow that's not the case here.
What am I really missing here?