I've heard from many Pythonists that they prefer list comprehensions because they can do everything you can do using higher-order functions such as filter and reduce, and more. So this question addresses them: what is a solid example of something you can do with list comprehensions that is tricky to do with HOFs?
-
list comprehensions can't emulate `reduce` – jamylak May 28 '13 at 05:02
-
@hammar you're right, this is actually vague. I fixed the title. – MaiaVictor May 28 '13 at 05:05
-
I'm voting to close this, the question is very vague and also incorrect – jamylak May 28 '13 at 05:05
-
@jamylak without explaining what you think is incorrect? – MaiaVictor May 28 '13 at 05:06
-
Is the question backwards? Don't you mean easy with higher-order functions but tricky with list comprehensions? The introduction of your question seems to suggest the exact opposite question. – Gabriella Gonzalez May 28 '13 at 05:06
-
@GabrielGonzalez hmm it seems OK for me, what part exactly is confusing you? (But no, I mean easy with comprehensions and hard with HOFs, as this is what many Pythonists say [1]) – MaiaVictor May 28 '13 at 05:07
-
So in other words, you want an example to confirm what the Pythonists have been saying? – Gabriella Gonzalez May 28 '13 at 05:08
-
I don't agree with their position so I'm asking for a concrete elaboration on their part. – MaiaVictor May 28 '13 at 05:10
-
This question is off topic for this site. What is your specific problem? – John La Rooy May 28 '13 at 05:12
-
@Viclib, most Pythonists are beginners. You're better off paying attention to what the experts say. – John La Rooy May 28 '13 at 05:13
-
@gribbler just got 2 great answers so I think the problem is clear enough. If you have any specific question I'll be glad to answer. Also, I agree with you. – MaiaVictor May 28 '13 at 05:16
-
I don't understand SO sometimes. People spend their time writing great answers. Other people vote positively on those answers - 13 votes already - which means the question was useful to the community. Yet nobody cares to cast a single vote for the question, and then it gets closed... – MaiaVictor May 28 '13 at 08:11
-
@Viclib You can have my vote to reopen and an apology since I voted to close. It was not evident from your question on first read that good answers would follow. It seemed to border on baiting, but perhaps I was too hasty. – A. Webb May 28 '13 at 08:38
-
@A.Webb thanks, that's really nice. – MaiaVictor May 28 '13 at 08:48
-
@Viclib The answers were useful to the community. Imagine someone asking "How does sorting work?" and someone giving a complete, diagram-filled, well-explained introduction to sorting using various algorithms and data structures. The presence of an awesome answer doesn't imply it was an awesome question. That said, SE does try to encourage more voting on questions with a badge or two. – AndrewC May 28 '13 at 08:56
-
@AndrewC in defense of the question, while "how sorting works" has been asked a billion times, this specific question was not asked here yet. It gave the chance to some people to talk about something that interested them, thus great answers were born. I was expecting that. – MaiaVictor May 28 '13 at 09:16
-
But I'm completely fine with your negative judgement, it's your right and duty! The problem comes when people don't vote at all, negatively or positively. 2 votes for a question with such momentum is unhealthy. I just asked a question the other day that got -5 votes. I didn't complain at all, just apologized and edited it to fit better. This is fine. Not voting is not! – MaiaVictor May 28 '13 at 09:17
-
I didn't downvote. I was using an extreme example to explain that the quality of questions and the answers are not intrinsically linked. Your question is much better than "How does sorting work?", but not as good as [Which part of Hindley-Milner do you not understand?](http://stackoverflow.com/questions/12532552/what-part-of-milner-hindley-do-you-not-understand), to take two rather extreme examples. There are three valid choices: +1, +0, -1. I don't think you should take the voting so personally. – AndrewC May 28 '13 at 13:15
-
Not personally at all. Just trying to keep the flow of the topic. – MaiaVictor May 29 '13 at 00:17
6 Answers
The answer is that there is no such example. Everything you can do with list comprehensions has a mechanical translation to higher-order functions. In fact, this is how Haskell implements list comprehensions: it desugars them to higher-order functions.
Given a list comprehension like this:
[(x, y) | x <- [1..3], y <- [4..6]]
Haskell desugars it to:
concatMap (\x -> concatMap (\y -> [(x, y)]) [4..6]) [1..3]
Similarly, if you put in predicates like:
[(x, y) | x <- [1..3], y <- [4..6], x + y /= 5]
... then that desugars to:
concatMap (\x -> concatMap (\y -> if x + y /= 5 then [(x, y)] else []) [4..6]) [1..3]
In fact, this desugaring is part of the Haskell specification; see section 3.11 of the Haskell Report: http://www.haskell.org/onlinereport/haskell2010/haskellch3.html#x8-420003.11
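As a quick sanity check (a sketch added for illustration, not part of the original answer), the comprehension and its hand-desugared form can be evaluated and compared directly:

-- Sketch: verify that the comprehension and the desugared version agree.
main :: IO ()
main = do
  let viaLC  = [(x, y) | x <- [1..3], y <- [4..6], x + y /= 5]
      viaHOF = concatMap
                 (\x -> concatMap (\y -> if x + y /= 5 then [(x, y)] else []) [4..6])
                 [1..3]
  print (viaLC == viaHOF)  -- True
  print viaLC              -- [(1,5),(1,6),(2,4),(2,5),(2,6),(3,4),(3,5),(3,6)]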

-
I think your `concatMap` code is nearly impossible to comprehend at first glance - with all the parentheses counting and balancing... For instance, you have an extra unbalanced parenthesis there. No such dangers and inconveniences with the LC syntax. :) – Will Ness May 28 '13 at 12:23
-
@WillNess Yes, and this is why we have the list monad (like list comprehensions) to sweeten the syntax. – Gabriella Gonzalez May 28 '13 at 14:08
-
Note that GHC is actually a bit more clever in how it does the actual desugaring. You'll often find that list comprehensions end up performing better than a manually desugared version using `concatMap` or than equivalent code using the list monad. – hammar May 28 '13 at 18:21
-
@hammar That's right. `ghc` desugaring avoids many of the common pitfalls of the naive desugaring, such as producing code optimized for `build/foldr` deforestation and also avoiding unnecessary use of singleton lists. – Gabriella Gonzalez May 28 '13 at 18:57
-
I [know](http://www.reddit.com/r/haskell/comments/38kl9z/thoughts_on_xlambdado_syntax/crw9y4r?context=3) that pattern-matches can be written in a list comprehension in Haskell, and if it is a non-total match, the non-matching cases will be silently thrown away. Example: `[ y | Just y <- xs]` (I believe it is called `catMaybes xs`). But if it is translated into `concatMap (\(Just y) -> [y]) xs`, then wouldn't that crash on `Nothing`s? – imz -- Ivan Zakharyaschev Jun 17 '15 at 13:54
-
@imz--IvanZakharyaschev See section 3.11 of the Haskell Report: https://www.haskell.org/onlinereport/exps.html . There is a rule that says that `[ e | p <- l, Q ] = let ok p = [ e | Q ]; ok _ = [] in concatMap ok l`, where `p` is a pattern (i.e. `Just y`), so if the pattern does not match then you will get the empty list – Gabriella Gonzalez Jun 17 '15 at 18:42
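Applying that rule from the Report to the `[ y | Just y <- xs ]` example above gives essentially the following (a sketch added for illustration, written with `where` rather than the Report's `let` for readability):

-- Sketch: the Report's translation of [ y | Just y <- xs ].  The fall-through
-- clause `ok _ = []` is what silently discards the Nothings, so nothing crashes.
catMaybes' :: [Maybe a] -> [a]
catMaybes' xs = concatMap ok xs
  where
    ok (Just y) = [y]
    ok _        = []
-- e.g. catMaybes' [Just 1, Nothing, Just 2] == [1,2]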
As has been said, everything you can do with list comprehensions can be desugared into higher-order functions, but a large part of the problem with doing this in Python is that Python lacks support for the kind of point-free programming you can use with `filter`, `map`, and friends in Haskell. Here's a somewhat contrived example, but I think you'll get the idea.
Let's take this Python code:
[(x,y) for x,y in zip(xrange(20), xrange(20, 0, -1)) if x % 2 == 0 and y % 2 == 0]
All it does is print this out:
[(0, 20), (2, 18), (4, 16), (6, 14), (8, 12), (10, 10), (12, 8), (14, 6), (16, 4), (18, 2)]
Here's the equivalent version with filter:
filter(lambda ns : ns[0] % 2 == 0 and ns[1] % 2 == 0, zip(xrange(20), xrange(20, 0, -1)))
I hope you'll agree with me that it's a lot uglier. There isn't really much you can do to make it less ugly without defining a separate function.
But let's look at the equivalent version in Haskell:
[(x,y) | (x,y) <- zip [0..19] [20,19..1], x `mod` 2 == 0 && y `mod` 2 == 0]
Okay, pretty much as good as the Python list comprehension version. What about the equivalent filter version?
import Data.Function
let f = (&&) `on` (==0) . (`mod` 2)
filter (uncurry f) $ zip [0..19] [20,19..1]
Okay, we had to do an import, but the code is (imo) a lot clearer once you understand what it does, although some people might still prefer `f` to be pointed, or even a lambda with `filter`. In my opinion the point-free version is more concise and conceptually clear. But the main point I want to make is that the HOF version is never really going to be this clear in Python, because Python cannot partially apply functions without bringing in a separate library and it lacks a composition operator. So in Python it is a good idea to prefer list comprehensions over map/filter, while in Haskell it can go either way depending on the specific problem.
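For comparison, here is roughly what those pointed alternatives might look like (a sketch added for illustration; the names `pairsPointed`, `pairsLambda`, and `bothEven` are made up):

-- Sketch of the pointed alternatives mentioned above (names are hypothetical):
-- a named pointed predicate, and a plain lambda passed to filter.
pairsPointed :: [(Int, Int)]
pairsPointed = filter bothEven (zip [0..19] [20,19..1])
  where bothEven (x, y) = x `mod` 2 == 0 && y `mod` 2 == 0

pairsLambda :: [(Int, Int)]
pairsLambda = filter (\(x, y) -> x `mod` 2 == 0 && y `mod` 2 == 0) (zip [0..19] [20,19..1])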

-
Oh god, what an awesome answer. I wish I could tick you but I have already ticked. Also, what's up with you guys... you spend your time writing great answers, which means you think it's a good question, yet don't care to cast a single vote on it... then it gets closed. Well, SO is weird sometimes. – MaiaVictor May 28 '13 at 08:09
-
Not convinced your Haskell `filter` version is better, since you still had to define a separate function, which you treated as a last resort when discussing improving the Python `filter` version. :) In this case, I think a pointed version of `f` would be clearer (and avoid the import). – Mark Reed May 28 '13 at 11:52
-
For me quite the opposite, your Haskell LC example is immediately readable (from left to right, all knowledge local), and its functional equivalent is at the first glance just *opaque* (can't read right away, must first absorb the `f` definition, which is detached from its use location, so no locality; must *think* about `uncurry`... even `filter (\(x,y)-> f x y) ...` would be much easier to read). *"a lot clearer **once you understand what it does"*** is a self-contradiction. :) So I take it as evidence to the contrary. :) – Will Ness May 28 '13 at 12:18
-
@MarkReed Yes, I agree that some people might prefer the pointed version quite a bit more. My main point though, was that the list comprehension versions in both languages were about as equally clear, but using HOFs in Python is going to be more difficult to achieve clearly because of Haskell's support for things like automatic currying, the composition operator, and other tools. Also I think `Data.Function.on` should really be included in Prelude fwiw. – Wes May 28 '13 at 19:45
In Haskell, list comprehensions are 'syntactic sugar' for conditionals and functions (or can trivially be translated into do notation and then desugared monadically). Here's the 'official' guide to translating them: http://www.haskell.org/onlinereport/haskell2010/haskellch3.html#x8-420003.11
Hence, since list comprehensions can be translated mechanically and straightforwardly into equivalent code using only higher-order functions, there is by definition nothing you can do with them that is difficult to do without them.
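For instance (a sketch added for illustration, not part of the answer), a simple comprehension, its do-notation form, and its fully desugared form are interchangeable:

-- Sketch: the same list written as a comprehension, in do notation for the
-- list monad, and with the desugared higher-order functions.
import Control.Monad (guard)

viaLC, viaDo, viaHOF :: [(Int, Int)]
viaLC  = [(x, y) | x <- [1..3], y <- [4..6], x + y /= 5]
viaDo  = do
  x <- [1..3]
  y <- [4..6]
  guard (x + y /= 5)
  return (x, y)
viaHOF = concatMap (\x -> concatMap (\y -> if x + y /= 5 then [(x, y)] else []) [4..6]) [1..3]
-- All three evaluate to the same list.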

The others are correct; list comprehensions do not provide any better manipulation of sequences, per se, compared to functions like map, reduce, filter, etc. They did not really address your question as to why Python programmers favor list comprehensions over higher-order functions, though.
The reason Python advocates them, and Python programmers use them, is that, according to Guido, the language creator, list comprehensions (and set comprehensions, dict comprehensions, and generator expressions) are easier to read and to write than functional expressions. Python's philosophy is that readability trumps all.
Guido dislikes functional programming constructs in general, and was wary about adding `lambda` syntax. It is just a matter of style and taste, not expressiveness or power. His opinions shape Python and how it is written.
For more details, here is a proposal by Guido to remove `lambda`, `map`, `filter`, and `reduce` from Python 3 and up. It was not implemented (except for the removal of `reduce`, which is no longer a builtin function), but he lays out his reasoning here: http://www.artima.com/weblogs/viewpost.jsp?thread=98196
He sums it up as follows, though:
filter(P, S) is almost always written clearer as [x for x in S if P(x)], and this has the huge advantage that the most common usages involve predicates that are comparisons, e.g. x==42, and defining a lambda for that just requires much more effort for the reader (plus the lambda is slower than the list comprehension).

-
My opinion personally? I generally agree. List comprehensions are clearer, and are closer to how I generally think about problems. And in the case of `reduce`, you can express what you're doing much more clearly with a regular `for` loop. I have never had a need to use such higher order functions. – Anorov May 28 '13 at 06:03
-
A loop? You think [this](http://pastebin.com/GTskRwaz) is cleaner than [this](http://pastebin.com/XvPdKEAn)? – MaiaVictor May 28 '13 at 06:08
-
@Viclib Yes. Especially if you need to do more than just apply a single operator, which many times you do. Plus, the `sum` function already exists in Python, making it even less practical. – Anorov May 28 '13 at 06:45
-
Part of the problem is that Python has poor support for point-free programming. Yes it has functools.partial, but that usually ends up looking uglier than the equivalent list comprehension. Also Python doesn't have infix functions in the first place, so (==42) would be `lambda x: x == 42`. Ugly. – Wes May 28 '13 at 07:16
-
@Wes Yep, this is a valid point. Higher-level functions are often more verbose in Python, but just because they are in Python doesn't mean they are in other languages. – Anorov May 28 '13 at 08:07
-
Yep, that's the deal, maybe it's a problem with Python's syntax for HOFs. – MaiaVictor May 28 '13 at 08:45
-
Loops are imperative and noisy. The right recursion scheme for a structural operation on an algebraic data type is plainly clearer. – nomen Jun 06 '13 at 22:27
Compare
[[x*x, x*x+x ..] | x <- [2..]]
and
map (\x-> map (*x) $ enumFrom x) $ enumFrom 2
The first is obviously more readable. You asked "tricky", not "impossible". And with `filter`, there's nothing to indicate whether we're filtering in or out the elements that pass or fail the given test. With LCs it is visually manifest.
So whenever there's an LC formulation, it is preferred IMO, just for the readability of it. Haskell's LC syntax is especially succinct and clear, clearer than Python's IMO (less noise). Shame not to use it. :)
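A tiny illustration of that filtering-direction point (a sketch added here, not from the answer):

-- With the comprehension, the condition visibly says which elements are kept;
-- with filter, you have to remember the convention that the predicate selects
-- the elements to keep.
keptLC, keptHOF :: [Int]
keptLC  = [x | x <- [1..10], even x]   -- reads as "keep the even ones"
keptHOF = filter even [1..10]          -- same list, [2,4,6,8,10], by convention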

-
You don't need list comprehensions to use list enumerations. **\x -> [x * x, x * x + x ..] `fmap` [2..]** (Note that there are supposed to be back-ticks around the fmap...) – nomen Jun 06 '13 at 22:23
-
@nomen ah, you're right, they are different (but look similar). you can use double back-ticks to enclose the code block in comments btw, like so: ``(\x-> map (*x) [x..]) `map` [2..]``. – Will Ness Jun 07 '13 at 04:16
I very rarely use list comprehensions for exactly the reasons in this question. However, there is one case where I've found them to be the only concise syntax: when a refutable pattern is to the left of the `<-`. Example:
data Foo = Bar Int | Baz String
getBazs :: [Foo] -> [String]
getBazs xs = [x | Baz x <- xs]
To write that without a list comprehension, you'd have to do something a lot longer like this:
data Foo = Bar Int | Baz String
getBazs :: [Foo] -> [String]
getBazs = foldr go []
  where
    go (Baz x) acc = x:acc
    go _       acc = acc
But unlike the list comprehension, that isn't a "good producer", so its output list won't be able to fuse with anything. To fix that, you'd have to either add the rewrite rules by hand, or switch to importing a different function that is a good producer:
import Data.Maybe
data Foo = Bar Int | Baz String
getBazs :: [Foo] -> [String]
getBazs = mapMaybe go
  where
    go (Baz x) = Just x
    go _       = Nothing
End result is it's a lot more to think about, and a lot more code, than the basic list comprehension, for the same result.
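A quick consistency check (a sketch; the names `getBazsLC`, `getBazsFoldr`, and `getBazsMapMaybe` are made up so the three definitions from the answer can coexist in one module):

import Data.Maybe (mapMaybe)

data Foo = Bar Int | Baz String

-- The three definitions from the answer, renamed so they can live side by side.
getBazsLC, getBazsFoldr, getBazsMapMaybe :: [Foo] -> [String]
getBazsLC xs = [x | Baz x <- xs]

getBazsFoldr = foldr go []
  where
    go (Baz x) acc = x : acc
    go _       acc = acc

getBazsMapMaybe = mapMaybe go
  where
    go (Baz x) = Just x
    go _       = Nothing

main :: IO ()
main = print (map ($ sample) [getBazsLC, getBazsFoldr, getBazsMapMaybe])
  where
    sample = [Bar 1, Baz "a", Bar 2, Baz "b"]   -- prints [["a","b"],["a","b"],["a","b"]]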
