
While writing Haskell as a programmer with prior exposure to Lisp, I came across something odd that I failed to understand.

This compiles fine:

{-# LANGUAGE NamedFieldPuns #-}
{-# LANGUAGE ExistentialQuantification #-}
data Foo = forall a. Show a => Foo { getFoo :: a }

showfoo :: Foo -> String
showfoo Foo{getFoo} = do
  show getFoo

whereas this fails:

{-# LANGUAGE NamedFieldPuns #-}
{-# LANGUAGE ExistentialQuantification #-}
data Foo = forall a. Show a => Foo { getFoo :: a }

showfoo :: Foo -> String
showfoo foo = do
  let Foo{getFoo} = foo
  show getFoo

To me it's not obvious why the second snippet fails.

The question would be:

Am I missing something, or does this behaviour stem from the fact that Haskell is not homoiconic?

My reasoning is, given that:

  1. Haskell needs to implement record pattern matching as a compiler extension, because of its choice to use syntax rather than data.

  2. Matching in a function head and matching in a let clause are two special cases.

It is difficult to understand those special cases, as they can neither be implemented nor looked up directly in the language itself.

As a consequence, consistent behaviour throughout the language is not guaranteed, especially in combination with additional compiler extensions, as in the example above.

ps: compiler error:

error:
    • My brain just exploded
      I can't handle pattern bindings for existential or GADT data constructors.
      Instead, use a case-expression, or do-notation, to unpack the constructor.
    • In the pattern: Foo {getFoo}
      In a pattern binding: Foo {getFoo} = foo
      In the expression:
        do { let Foo {getFoo} = foo;
             show getFoo }

edit: A different compiler version gives this error for the same problem

* Couldn't match expected type `p' with actual type `a'
    because type variable `a' would escape its scope
  This (rigid, skolem) type variable is bound by
    a pattern with constructor: Foo :: forall a. Show a => a -> Foo
zabeltech
  • [Please don’t add tags to your question that your question mentions, but is not about.](https://meta.stackexchange.com/a/18879/188673) This question is about Haskell, not Lisp. – Alexis King Nov 16 '18 at 23:22
  • 4
    Spoilers: "homoiconic" doesn't actually mean anything. – melpomene Nov 16 '18 at 23:25

2 Answers


Am I missing something, or does this behaviour stem from the fact that Haskell is not homoiconic?

No. Homoiconicity is a red herring: every language is homoiconic with its source text and its AST¹, and indeed, Haskell is implemented internally as a series of desugaring passes between various intermediate languages.

The real problem is that let...in and case...of just have fundamentally different semantics, which is intentional. Pattern-matching with case...of is strict, in the sense that it forces the evaluation of the scrutinee in order to choose which RHS to evaluate, but pattern bindings in a let...in form are lazy. In that sense, let p = e1 in e2 is actually most similar to case e1 of ~p -> e2 (note the lazy pattern match using ~!), which produces a similar, albeit distinct, error message:

ghci> case undefined of { ~Foo{getFoo} -> show getFoo }

<interactive>:5:22: error:
    • An existential or GADT data constructor cannot be used
        inside a lazy (~) pattern
    • In the pattern: Foo {getFoo}
      In the pattern: ~Foo {getFoo}
      In a case alternative: ~Foo {getFoo} -> show getFoo

This is explained in more detail in the answer to Odd ghc error message, "My brain just exploded"?.
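For completeness, here is a minimal sketch (my own, assuming GHC with the extensions from the question) of the workaround the original error message suggests: unpacking the existential with a strict case-expression type-checks, because the hidden type never gets a chance to escape the case alternative.

```haskell
{-# LANGUAGE ExistentialQuantification #-}
{-# LANGUAGE NamedFieldPuns #-}

data Foo = forall a. Show a => Foo { getFoo :: a }

-- Unpacking via a strict case-expression is allowed,
-- unlike the lazy pattern binding in the question's let.
showfoo :: Foo -> String
showfoo foo = case foo of
  Foo{getFoo} -> show getFoo

main :: IO ()
main = putStrLn (showfoo (Foo (5 :: Int)))  -- prints 5
```

A match in the function head, as in the question's first snippet, is strict in the same way, which is why that version compiles.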


¹ If this doesn’t satisfy you, note that Haskell is homoiconic in the sense that most Lispers use the word, since it supports an analog to Lisp’s quote operator in the form of [| ... |] quotation brackets, which are part of Template Haskell.
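As a small illustration of that footnote (a sketch, assuming GHC with Template Haskell available): quoting an expression yields its AST as an ordinary Haskell value, much like Lisp's quote.

```haskell
{-# LANGUAGE TemplateHaskell #-}

import Language.Haskell.TH (runQ)

-- [| ... |] quotes an expression; runQ lets us inspect the
-- resulting AST in IO. Printing it shows constructors such as
-- InfixE and LitE (IntegerL 1).
main :: IO ()
main = do
  ast <- runQ [| 1 + 2 |]
  print ast
```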

Alexis King
  • Your definition of homoiconicity does not address my point. More pressing is this: it might be that I have a limited understanding here, but why does it not compile if I make the pattern binding strict? – zabeltech Nov 17 '18 at 01:03
  • @zabeltech Perhaps because it simply wasn’t implemented because it’s a rather uncommon use case. You could certainly file an issue on the [GHC trac](https://ghc.haskell.org/trac/ghc) requesting it be implemented. – Alexis King Nov 17 '18 at 01:07
  • @zabeltech In any case, “homoiconicity” [doesn’t mean anything](https://twitter.com/ShriramKMurthi/status/1046944950531829760), and even if it did, it [isn’t the point](http://calculist.org/blog/2012/04/17/homoiconicity-isnt-the-point/). Even in a Lisp, you wouldn’t want errors involving `let` reported in terms of `case`. Two language features having similar but distinct semantics and being kept separate, rather than desugaring one into the other, is a choice completely orthogonal to homoiconicity, and many features in GHC *are* defined by desugaring (such as `do` notation and `LambdaCase`). – Alexis King Nov 17 '18 at 01:12
  • Hmm, yes, it might be that nobody bothered... again, I'm a big Haskell fan, but it seems off that the evaluation strategy affects the type checker, although you have proven that it does... do you see a common-sense reason for that? – zabeltech Nov 17 '18 at 01:39
  • 4
    @zabeltech Evaluation strategy has to be relevant to the typechecker once you start allowing values to serve as witnesses for type-level truths (which is precisely what GADTs, and to a lesser extent, existentials, do), since Haskell’s implementation of laziness makes every type inhabited. Indeed, I just watched [a talk that demonstrates how important evaluation order is to static analysis](https://www.youtube.com/watch?v=nDmNTRG1V_0) in the context of Haskell, which could be interesting to you if you want to have a better understanding of what can go wrong when you do not consider laziness. – Alexis King Nov 17 '18 at 01:43
  • I believe that what you say regarding lazy patterns is true; however, I think the bigger point about the problem with this code is the hidden type variable a escaping its scope. This is reflected by the error I get with a different compiler version. – zabeltech Nov 19 '18 at 09:18
  • @zabeltech I don’t think the answer you have accepted is correct, as I’ve explained in [my comment on that answer](https://stackoverflow.com/questions/53346318/pattern-bindings-for-existential-constructors/53356406#comment93631075_53356406). – Alexis King Nov 19 '18 at 20:27

I thought about this a bit, and although the behaviour seems odd at first, after some thinking I guess one can perhaps justify it thus:

Say I take your second (failing) example and after some massaging and value replacements I reduce it to this:

data Foo = forall a. Show a => Foo { getFoo :: a }

main :: IO ()
main = do
    let Foo x = Foo (5::Int)
    putStrLn $ show x

which produces the error:

Couldn't match expected type ‘p’ with actual type ‘a’ because type variable ‘a’ would escape its scope

If the pattern match were allowed, what would be the type of x? Well, the type would of course be Int. However, the definition of Foo says that the type of the getFoo field is any type that is an instance of Show. An Int is an instance of Show, but it is not any type; it is a specific one. In this regard, the actual specific type of the value wrapped in that Foo would become "visible" (i.e. escape) and thus violate our explicit guarantee that forall a . Show a => ...
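To make the contrast concrete, here is a small sketch of my own: the same match is fine inside a case alternative, because everything done with the wrapped value happens under the Show constraint and its rigid type never leaves that scope.

```haskell
{-# LANGUAGE ExistentialQuantification #-}

data Foo = forall a. Show a => Foo { getFoo :: a }

-- Inside the case alternative, x has a rigid, unknown type;
-- all we may do with it is what Show allows, so nothing escapes.
describe :: Foo -> String
describe f = case f of
  Foo x -> show x

main :: IO ()
main = mapM_ (putStrLn . describe) [Foo (5 :: Int), Foo "hello"]
```

Note that describe works uniformly for any wrapped type with a Show instance; at no point does the caller learn which type was inside.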

If we now look at a version of the code that works by using a pattern match in the function declaration:

data Foo = forall a . Show a => Foo { getFoo :: !a }

unfoo :: Foo -> String
unfoo Foo{..} = show getFoo

main :: IO ()
main = do
    putStrLn . unfoo $ Foo (5::Int)

Looking at the unfoo function, we see that nothing there says the type inside the Foo is any specific type (an Int or otherwise). In the scope of that function, all we have is the original guarantee that getFoo can be of any type which is an instance of Show. The actual type of the wrapped value remains hidden and unknowable, so no type guarantees are violated and happiness ensues.

PS: I forgot to mention that the Int bit was of course just an example. In your case, the type of the getFoo field inside the foo value is of type a, but this is a specific (non-existential) type to which GHC's type inference is referring (not the existential a in the type declaration). I just came up with an example using a specific Int type so that it would be easier and more intuitive to understand.

Erick Gonzalez
  • I think this answer is incorrect, or at least the code example you included is wrong. When [I run your program](https://repl.it/repls/MountainousThreadbareDeprecatedsoftware), I don’t get the “type variable would escape its scope” error, I get the “my brain just exploded” error that occurs from an attempt to match on an existential in a pattern binding. Furthermore, I don’t think your explanation makes sense—why do you think `x` should have type `Int`? Constructing an existential “seals away” the type of its argument; opening it will not get that type back. `x` should be a fresh, rigid type. – Alexis King Nov 19 '18 at 15:25
  • Well .. (edited for brevity) `$ cat app/Main.hs {-# LANGUAGE ExistentialQuantification, RecordWildCards #-} module Main where data Foo = forall a. Show a => Foo { getFoo :: a } main::IO() main = do let Foo x = Foo (5::Int) putStrLn $ show x $ stack build /Users/erick/sandbox/atest/app/Main.hs:8:13: error: • Couldn't match expected type ‘p’ with actual type ‘a’ because type variable ‘a’ would escape its scope` – Erick Gonzalez Nov 20 '18 at 16:32
  • Sorry for the poor formatting above but it seems one can't use blocks in comments.. either way I think you get the point. You get one result, I get another... FWIW I am using GHC 8.2.2 Now, I believe you didn't understand my point. What I saying is that the following statement: `let Foo x = Foo (5::Int)` as you say `x` can not be (specifically) an Int... but the let statement above pretty much states that x would have to be an int, thus violating the existential constraint.. – Erick Gonzalez Nov 20 '18 at 16:40
  • 1
    Upon reading your reply again I believe I understand what you meant with "why do you think `x` should have type `Int`"? ... well, because of referential transparency. If the thing on the left and the right side of the `=` can be used interchangeably anywhere in the program, it follows that if one matches `Foo x` to `Foo (5::Int)` then I should be able to replace any occurrence of x with 5 after this.. ergo they would be the same thing..and the same thing has (of course) the same type.. the compiler can't just "look the other way" and pretend it doesn't know in this case that x is an `Int` – Erick Gonzalez Nov 20 '18 at 16:58