
There is a semi-famous article written by Guido himself hinting that reduce() should go the way of the dodo and leave the language. It was even demoted from being a top-level function in Python 3 (instead getting stuffed in the functools module).

For many other functional programming staples (map(), filter(), etc.), clear and idiomatic alternatives are available. For example, most of the time a map() is better written as a list comprehension.

What I'd like to know is if there is a similar "more pythonic" alternative to the reduce function. I have a bit of a functional programming background (ML in particular), so reduce() often springs to mind when thinking of a solution, but if there's a better way to express those solutions (short of unrolling a reduce call into a for loop) I'd like to know.

PeeHaa
Adam Parkin
  • Well you can always just do a `from functools import reduce`. – Niklas B. Feb 28 '12 at 00:11
  • If thinking in terms of `reduce`, `fold`, `map` etc. suits you, I don't think you should change your way of thinking. You're on good tracks already. – Irfy Feb 28 '12 at 00:13
  • @NiklasB.: I know you can import it from functools (that fact was alluded to in my question). – Adam Parkin Feb 28 '12 at 00:28
  • Somehow in all the years I haven't used reduce even once. Maybe because the problems were too complicated to only have the function look at the next item and the aggregated value of data processed so far. Usually the current position in the iterable was important to me, so I would instead write `for i, x in enumerate(lst): ...`, and since I used that so often, I wouldn't bother actually using reduce even when it fitted nicely. – Frg Feb 28 '12 at 00:30
  • @Adam: My point was that I don't understand what the question is about. No, there is no such thing as a reduce that is not a reduce. – Niklas B. Feb 28 '12 at 00:36
  • @Frg: I think `reduce(operator.mul, numbers)` is still the nicest way to compute the product of `numbers`. – Sven Marnach Feb 28 '12 at 00:37
  • @Frg: strictly speaking `reduce()` is syntactic sugar, so not entirely surprising you've never used it. However, it's extremely __concise__ syntactic sugar, and in many functional languages (Haskell, ML, etc) it's often *the* most elegant way to solve a surprisingly large number of problems. – Adam Parkin Feb 28 '12 at 05:02
  • Well... if I came from a functional language background, I would definitely use `reduce`, even if only as a form of old habit that dies hard. Luckily this function will only disappear from the builtins and not disappear completely (the functools module has it), because then everybody and their mother would just start defining their own `reduce` in all their scripts. :) – Frg Feb 28 '12 at 17:10
  • @Frg: LOL, true, we tend to hold onto those old familiar constructs, kicking and screaming, don't we? ;) – Adam Parkin Feb 28 '12 at 17:17
  • FYI: List comprehensions and loops are only an alternative in some cases. They cannot be programmatically constructed or called, ie partial(map, fn) – dietbuddha Nov 24 '12 at 22:36

3 Answers


As Guido's linked article says, you should just write an explicit for loop if you want to avoid reduce(). You can replace the line

result = reduce(function, iterable, start)

with

result = start
for x in iterable:
    result = function(result, x)
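For a concrete check of the equivalence (my own toy example; `operator.mul` and the start value `1` are arbitrary choices):

```python
from functools import reduce
import operator

numbers = [3, 4, 5]

# reduce() version
product_reduce = reduce(operator.mul, numbers, 1)

# equivalent explicit loop
result = 1
for x in numbers:
    result = operator.mul(result, x)

print(product_reduce, result)  # 60 60
```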
Sven Marnach
  • Also note, this isn't quite 100% equivalent -- `reduce()` as defined in the standard library doesn't require a start argument. So then you have to put a guard around the for loop to check if iterable is empty, if it's not, take the 1st item as the start, and loop over the remaining items. Suddenly that 1-line `reduce()` looks even more attractive regardless of what Guido thinks. ;) – Adam Parkin Feb 28 '12 at 17:00
  • @AdamParkin: Your last comment does not make too much sense. The two code snippets are 100% equivalent (for practical purposes). You *could* call `reduce()` without a `start` argument, and then the equivalent would look different, but you wouldn't need to special-case empty iterables any more than you would need to when using `reduce()` without `start`. – Sven Marnach Feb 28 '12 at 17:05
  • `reduce(set.intersection, list_of_some_sets)` would reduce down to the intersect of all sets defined in list_of_some_sets (it implicitly uses the 1st elt of the list as the start). To unroll this into a loop I'd have to extract the 1st elt (well, any element), use that as my "start", & then enter the loop iterating over the remaining n-1 elements. However, pulling that 1st (or whatever) item from the list would only be valid if you know the list is not empty, hence the need for a guard. Look at the [definition in the docs](http://docs.python.org/library/functions.html#reduce) for what I mean. – Adam Parkin Feb 28 '12 at 17:14
  • @AdamParkin: That far I agree. But the `reduce()` call will also error out if `list_of_some_sets` is empty, so there's no difference. – Sven Marnach Feb 28 '12 at 17:18
  • @AdamParkin: The try/except in the documentation example only translates `StopIteration` to `TypeError`. If you insist on a `TypeError`, you'd need a try/except block, yes. – Sven Marnach Feb 28 '12 at 17:20
  • Is there a significant performance difference between the two? – jpmc26 Sep 24 '19 at 11:40
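To make the comment thread concrete, here is a sketch of the loop equivalent for reduce() called without a start value (the helper name reduce_no_start is mine; the loop mirrors the rough equivalent described in the Python docs):

```python
from functools import reduce

def reduce_no_start(function, iterable):
    it = iter(iterable)
    try:
        # the first element doubles as the start value
        result = next(it)
    except StopIteration:
        raise TypeError("reduce() of empty iterable with no initial value")
    for x in it:
        result = function(result, x)
    return result

list_of_some_sets = [{1, 2, 3}, {2, 3, 4}, {2, 5}]
print(reduce_no_start(set.intersection, list_of_some_sets))  # {2}
print(reduce(set.intersection, list_of_some_sets))           # {2}
```

Both versions error out on an empty iterable, which matches Sven's point in the comments.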

What I'd like to know is if there is a similar "more pythonic" alternative to the reduce function.

Yes and no. It depends upon the use case.

In the linked article Guido suggests that most but not all reductions ought to be written as loops. There are limited circumstances in which he sees reduce as being applicable.

So in my mind, the applicability of reduce() is pretty much limited to associative operators, and in all other cases it's better to write out the accumulation loop explicitly.

There aren't a whole lot of associative operators. (Those are operators X for which (a X b) X c equals a X (b X c).) I think it's just about limited to +, *, &, |, ^, and short-circuiting and/or.
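For most of those associative operators, Python already ships a dedicated reduction, which is usually the more pythonic spelling. A side-by-side comparison (my own examples; `math.prod` would cover the product case on Python 3.8+):

```python
from functools import reduce
import operator

numbers = [2, 3, 4]
flags = [True, True, False]

print(reduce(operator.add, numbers), sum(numbers))   # 9 9
print(reduce(operator.mul, numbers))                 # 24 (math.prod on 3.8+)
print(reduce(operator.and_, flags), all(flags))      # False False
print(reduce(operator.or_, flags), any(flags))       # True True
```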

kkurian
  • Great point, and I agree, though this isn't really an answer to the question, but rather a (quite good) insight related to the article cited in the question. Perhaps this would be better suited as a comment on the question than a suggested answer? – Adam Parkin Apr 28 '17 at 17:31
  • I ran across your answer when deciding if I should use `reduce` in my code. In my case I'm reducing to a set of unique values from a very very large list. So for what it's worth, I'd like to include a set union in your set of associative operators. – Rich Nov 15 '17 at 07:25
  • That part of the answer is quoting Guido, so your beef is with him. :-) Also note that the pipe operator, as shown, could be set union. https://docs.python.org/3.6/library/stdtypes.html?highlight=set#frozenset.union – kkurian Nov 15 '17 at 08:01
  • Which was strange to me, because for those situations I would just use `sum`, `numpy.product`, `numpy.bitwise_and`, `numpy.bitwise_or`, `numpy.bitwise_xor`, `all`, `any`. It is all the other situations where I reach for reduce :) – pavon Oct 11 '18 at 01:02

A two-line alternative for Python >= 3.8, using the walrus operator:

result = your_array[0]
[result := your_func(result, x) for x in your_array[1:]]
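A dependency-free illustration of the same pattern (my own example), checked against functools.reduce:

```python
from functools import reduce
import operator

your_array = [1, 2, 3, 4]

# walrus-based accumulation (Python >= 3.8); the list the comprehension
# builds is thrown away -- only the final value of `result` matters
result = your_array[0]
[result := operator.add(result, x) for x in your_array[1:]]

print(result, reduce(operator.add, your_array))  # 10 10
```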

Example for calculating the Kronecker product (imports added so the snippet runs as-is):

import numpy as np
from functools import reduce

list_of_matrices = [
   [[1,2],
    [3,4]],

   [[5,6,2],
    [7,2,1]],

   [[9,8,7],
    [1,4,5],
    [8,2,1],
    [1,1,2]],
    ]*4

result_reduce = reduce(np.kron, list_of_matrices) 

result_walrus = list_of_matrices[0]
[result_walrus := np.kron(result_walrus, x) for x in list_of_matrices[1:]]

print((result_walrus == result_reduce).all())

Output:

True

Timings:

Reduce 30 times: 10.39 sec.
Walrus 30 times: 10.37 sec.
voldr