29

I'm in the process of learning Haskell, and type classes seem like a powerful way to make type-safe polymorphic functions. But a lot of the Haskell Prelude functions don't use them. More specifically:

  • Most of the list functions don't work with other data structures (for instance, foldr and length are only implemented for lists and can't be used on arrays).

  • Modules like Data.ByteString are unusable unless you use import qualified since they include functions that have the same names as Prelude functions.
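
For concreteness, here is a minimal sketch of the clash the second bullet describes: `Data.ByteString` defines its own `length`, `map`, and so on, so the module is designed to be imported qualified (the alias `B` is just a common convention):

```haskell
-- Data.ByteString.Char8 re-exports names that collide with the
-- Prelude (length, map, filter, ...), so we import it qualified.
import qualified Data.ByteString.Char8 as B

main :: IO ()
main = print (B.length (B.pack "hello"))  -- B.length, not Prelude's length
```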

It seems like both of these problems would go away if the standard library used generic functions with type classes (please let me know if I'm totally off base with this).
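
As a sketch of the generalization the first bullet asks about: `Data.Foldable` provides exactly these class-based versions, and they do work on arrays and other containers (in modern GHC the `Foldable` versions are even re-exported from the Prelude):

```haskell
import qualified Data.Foldable as F
import Data.Array (Array, listArray)  -- from the array package shipped with GHC

-- A fold that works on any Foldable container, not just lists
total :: F.Foldable t => t Int -> Int
total = F.foldr (+) 0

main :: IO ()
main = do
  print (total [1, 2, 3])                                      -- a list: 6
  print (total (listArray (0, 2) [1, 2, 3] :: Array Int Int))  -- an array: 6
  print (F.length (Just 'x'))                                  -- even Maybe: 1
</imports>
```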

I have two questions:

  1. Are there technical or design reasons that the Prelude is like this, or is it just for historical reasons?

  2. Looking around, it looks like there are a couple of libraries (like Data.Foldable and, if I'm not mistaken, Scrap Your Boilerplate) that replace the standard Prelude functions with generic alternatives. Are there any plans to incorporate these ideas into future versions of Haskell?

shosti
  • 7,332
  • 4
  • 37
  • 42
  • 1
    No idea, but there are plenty of things that could be done before this, like making the Monad class depend on Applicative, using 'pure' or 'unit' instead of 'return', and getting rid of 'map' in favour of only ever using 'fmap'. But I just don't think it will happen; too many things would break if they did. – Robert Massaioli Jan 24 '11 at 05:54
  • 1
    I think it would be time for more polymorphism, for some corrections in type classes (like the Monad / Applicative fix mentioned by Robert), and for making some compiler extensions standard. You can get only so far with evolution. It would be important for the language to make a clean break and fix most of the gaps and errors of the past. I think the transition doesn't have to be a shock therapy but can be made reasonably smooth (see e.g. Python 2 to 3). – Landei Jan 24 '11 at 09:51

4 Answers

19

There is a very good pragmatic reason that "standard" Haskell (Prelude + base + maybe some more) doesn't use more polymorphism:

Designing general-use type classes is hard. Good designs for classes that abstract over container types like lists, arrays and "bytestrings" (personally I don't really consider ByteString a container) aren't floating around waiting to be included in Haskell 2012. There are some designs, e.g. ListLike and the Edison classes, and a number of people have chipped away at the problem, but excepting Foldable and Traversable no one has produced any compelling designs.
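
For readers unfamiliar with the two designs the answer does call compelling, here is a small illustration of `Traversable`: `traverse` runs an effectful function across a container, and the same code works for lists, `Maybe`, arrays, and any other instance (the `halve` helper is just an example):

```haskell
import Data.Traversable (traverse)

-- An effectful function: succeeds only on even numbers
halve :: Int -> Maybe Int
halve n = if even n then Just (n `div` 2) else Nothing

main :: IO ()
main = do
  print (traverse halve [2, 4, 6])  -- every element succeeds: Just [1,2,3]
  print (traverse halve [2, 3])     -- one element fails: Nothing
```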

stephen tetley
  • 4,465
  • 16
  • 18
  • They could just use the ones already available in other languages, like the C++ STL. – Puppy Jan 24 '11 at 11:55
  • 11
    The STL is heavily engineered towards mutable structures and pointers. And even for that, it exposes a great deal of implementation. – sclv Jan 24 '11 at 13:05
11

The Haskell base library used to be more polymorphic: list comprehensions worked for any monad, map and ++ weren't limited to lists, and perhaps other things changed as well.

But folks at the time thought that this led to confusing error messages for beginners, and that those who aren't beginners could always reach for the more general versions themselves.
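
The old behaviour survives as an opt-in GHC extension, `MonadComprehensions`, which makes comprehension syntax work for any monad again; a minimal sketch (the `addMaybes` name is illustrative):

```haskell
{-# LANGUAGE MonadComprehensions #-}

-- With MonadComprehensions, comprehension syntax desugars to any
-- monad's (>>=), not just the list monad -- here, Maybe.
addMaybes :: Maybe Int -> Maybe Int -> Maybe Int
addMaybes mx my = [ x + y | x <- mx, y <- my ]

main :: IO ()
main = do
  print (addMaybes (Just 1) (Just 2))  -- Just 3
  print (addMaybes Nothing (Just 2))   -- Nothing
```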

Antoine Latter
  • 1,545
  • 10
  • 13
  • +1, excellent point. Nobody ever complains that GHC's error messages are easy to understand, especially when polymorphism and "infinite types" are involved. – John L Jan 25 '11 at 11:14
  • 2
    Here's a quote from Simon Peyton Jones on why he thought monad comprehensions were removed from Haskell: http://hackage.haskell.org/trac/ghc/ticket/4370#comment:1 But he isn't really making a strong statement. Sorry, I don't have anything else handy at the moment. – Antoine Latter Mar 05 '11 at 20:36
8
  1. While there are many things in base, and specifically the Prelude, that are historical, I think any generalization would see plenty of technical push-back. The main issue is speed: if your function has a type class constraint then you're going to be passing around a dictionary for the type class functions at runtime, or possibly eating more code space when the compiler specializes the function away instead.

  2. Some of the libraries, such as SYB, use extensions that aren't part of Haskell. The first task would be to formalize and build support for these features. Look at the Haskell' documents to see where Haskell is going and how you might be able to influence that path.
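
A minimal sketch of the trade-off in point 1: a constrained function like `sumAll` below receives a `Num` dictionary at runtime when called at an unknown type, and GHC's `SPECIALIZE` pragma is the standard way to ask for a dictionary-free copy at a concrete type (at the cost of extra generated code):

```haskell
-- Polymorphic: callers at unknown types pass a Num dictionary.
sumAll :: Num a => [a] -> a
sumAll = foldr (+) 0
-- Ask GHC to also compile a dictionary-free copy for Int.
{-# SPECIALIZE sumAll :: [Int] -> Int #-}

main :: IO ()
main = print (sumAll [1, 2, 3 :: Int])  -- 6
```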

Thomas M. DuBuisson
  • 64,245
  • 7
  • 109
  • 166
  • 3
    Beh, GHC has a good specializer; speed is not really an issue. List is the "iterator" data structure; it means the same thing as a foreach in imperative languages -- if you can convert to a list and then do a list op to get what you need, there is no reason to make it typeclass polymorphic. OTOH there are some operations that ought to be more polymorphic, e.g. `zip` (cf. `ZipList`). – luqui Jan 24 '11 at 07:01
  • @luqui: So the idiomatic way to fold over an array (for example) would be to transform it to a list, fold over the list, and then transform it back to an array? I guess that makes sense, although it would still be nice to be able to fold over things like trees. And other list functions, like `length`, have nothing to do with iterating. – shosti Jan 24 '11 at 17:05
  • 1
    @eman, except you don't transform back into an array because fold doesn't return a list, it returns a summary value. See the standard typeclasses `Foldable` and `Traversable` for things that can be folded and iterated, respectively. – luqui Jan 24 '11 at 17:18
  • Why would you need to pass around a dictionary at runtime for functions with typeclass constraints? Wouldn't they be elided at compile-time? – Bill Jan 26 '11 at 23:07
  • 1
    @Bill not always. To eliminate the dictionaries compilers usually have to do function specialization (lots of papers on that) which can bloat the amount of code. As Mark Jones showed long ago, [this isn't often the case](http://web.cecs.pdx.edu/~mpj/pubs/pepm94.pdf). Another issue is it tends to cause headaches for separate compilation (so do whole program compilation, damn it!). – Thomas M. DuBuisson Jan 27 '11 at 00:17
6

Real World Haskell has some insights about this in the Monad Transformers chapter:

In an ideal world, would we make a break from the past, and switch over Prelude to use Traversable and Foldable types? Probably not. Learning Haskell is already a stimulating enough adventure for newcomers. The Foldable and Traversable abstractions are easy to pick up when we already understand functors and monads, but they would put early learners on too pure a diet of abstraction. For teaching the language, it's good that map operates on lists, not on functors.
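
For concreteness, the distinction the quote draws is the one between `map` and `fmap`, which already coexist in the Prelude:

```haskell
-- map is list-only; fmap is its Functor-polymorphic counterpart.
main :: IO ()
main = do
  print (map (+ 1) [1, 2, 3 :: Int])        -- [2,3,4]: lists only
  print (fmap (+ 1) [1, 2, 3 :: Int])       -- [2,3,4]: lists are a Functor
  print (fmap (+ 1) (Just 1 :: Maybe Int))  -- Just 2: and so is Maybe
```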

Nefrubyr
  • 6,936
  • 2
  • 29
  • 20
  • 11
    I tend to disagree. E.g. the extended for loop in Java works for arrays, lists, sets etc., and I **never** heard a newbie complain about that. The reaction was often exactly the opposite: "Cool, I can change my own classes so that it works for them, too!" So making map polymorphic doesn't make it harder to use, but may stimulate experiments to make self-made data structures "mappable". – Landei Jan 24 '11 at 14:02
  • 6
    Yeah, I have to say, of all the tricky things in Haskell, generic functions are not high on the list (all the numeric functions are generic, and nobody has any trouble with those). The `import qualified`s and obscure prefixes are much more off-putting, especially coming from an OO background. – shosti Jan 24 '11 at 16:59
    The numeric classes have special language support; see the keyword `default` in the language definition. That is possible because there is a pretty good understanding of when to disambiguate a literal to a concrete numeric type. – jmg Jan 28 '11 at 07:50