72

When would you NOT want to use functional programming? What is it not so good at?

I am more looking for disadvantages of the paradigm as a whole, not things like "not widely used", or "no good debugger available". Those answers may be correct as of now, but they deal with FP being a new concept (an unavoidable issue) and not any inherent qualities.

Gordon Gustafson

8 Answers

45

It's hard for me to think of many downsides to functional programming. Then again, I am a former chair of the International Conference on Functional Programming, so you may safely assume I am biased.

I think the main downsides have to do with isolation and with barriers to entry. Learning to write good functional programs means learning to think differently, and to do it well requires a substantial investment of time and effort. It is difficult to learn without a teacher. These properties lead to some downsides:

  • It is likely that a functional program written by a newcomer will be unnecessarily slow—more likely than, say, a C program written by a newcomer to C. On the other hand, it is about equally likely that a C++ program written by a newcomer will be unnecessarily slow. (All those shiny features...)

    Generally experts have no difficulty writing fast functional programs; and in fact some of the best-performing parallel programs on 8- and 16-core processors are now written in Haskell.

  • It's more likely that someone starting functional programming will give up before realizing the promised productivity gains than will someone starting, say, Python or Visual Basic. There just isn't as much support in the form of books and development tools.

  • There are fewer people to talk to. Stackoverflow is a good example; relatively few Haskell programmers visit the site regularly (although part of this is that Haskell programmers have their own lively forums which are much older and better established than Stackoverflow).

    It's also true that you can't talk to your neighbor very easily, because functional-programming concepts are harder to teach and harder to learn than the object-oriented concepts behind languages like Smalltalk, Ruby, and C++. And also, the object-oriented community has spent years developing good explanations for what they do, whereas the functional-programming community seem to think that their stuff is obviously great and doesn't require any special metaphors or vocabulary for explanation. (They are wrong. I am still waiting for the first great book Functional Design Patterns.)

  • A well-known downside of lazy functional programming (applies to Haskell or Clean but not to ML or Scheme or Clojure) is that it is very difficult to predict the time and space costs of evaluating a lazy functional program—even experts can't do it. This problem is fundamental to the paradigm and is not going away. There are excellent tools for discovering time and space behavior post facto, but to use them effectively you have to be expert already.
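
    To make that last point concrete, here is a minimal Haskell sketch of the kind of cost surprise meant here (lazySum and strictSum are illustrative names): both functions compute the same sum, but the lazy left fold builds a long chain of unevaluated thunks before doing any arithmetic, while the strict variant reduces as it goes.

    import Data.List (foldl')

    -- Lazy left fold: builds a thunk chain of the form (((0+1)+2)+3)+...,
    -- which can exhaust memory or stack on a large list (depending on
    -- compiler and optimization settings).
    lazySum :: [Integer] -> Integer
    lazySum = foldl (+) 0

    -- Strict left fold: forces the accumulator at each step and runs in
    -- constant space.
    strictSum :: [Integer] -> Integer
    strictSum = foldl' (+) 0

    main :: IO ()
    main = print (strictSum [1 .. 10000000])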

Norman Ramsey
  • 9
    "Generally experts have no difficulty writing fast functional programs; and in fact some of the best-performing parallel programs on 8- and 16-core processors are now written in Haskell". Only for the most trivial problems. See http://flyingfrogblog.blogspot.com/2010/06/regular-shape-polymorphic-parallel.html – J D Nov 28 '10 at 22:15
  • 5
    @Jon: It depends thoroughly on the exact nature of the problem and on what sort of cache locality you get. Measurement (of which benchmarking is only one type) will show which is best; pontificating on a webpage won't. – Donal Fellows Nov 29 '10 at 00:42
  • 1
    @Donal: Calculating the *cache complexity* of your algorithm under a simple multicore model is also a strong indicator of which is best, as I said on that web page. – J D Nov 29 '10 at 10:23
  • @Jon: Well, that works for some algorithms (or rather some tasks admitting solution by algorithms, to be accurate) but others remain tricky – open research problems in algorithmic design – and some are known to be nasty. (Some tasks in program analysis are really difficult to get cache locality for except in very restricted cases.) – Donal Fellows Nov 29 '10 at 11:19
  • Texas Multicore's recent results for Simon Marlow's parallel Haskell benchmark `matmult`, which show no speedup on a modern multicore, are another counterexample to Norman's assertion: http://texasmulticoretechnologies.com/technology/example – J D Dec 12 '10 at 11:48
  • 2
    I couldn't think of a better thread (well, one or two maybe) to help the interested reader reach their own conclusion on the precise merits (or lack thereof) of jdh's opinions. – sclv Dec 18 '10 at 04:32
  • 1
    @sclv: Where is the evidence to support Norman's claim that "Generally experts have no difficulty writing fast functional programs; and in fact some of the best-performing parallel programs on 8- and 16-core processors are now written in Haskell"? Where are these fast parallel Haskell programs? – J D Dec 21 '10 at 15:54
  • 8
    For the record -- my comment was in reply to a now deleted comment by jdh pointing to a rather painful to read reddit thread. I won't be responding here any further, as I don't like the fact that jdh can delete comments without providing any trace, and has shown his willingness to do so. – sclv Dec 21 '10 at 17:15
  • @sclv: I don't recall deleting my comment. – J D Aug 31 '11 at 20:07
30

One big disadvantage to functional programming is that on a theoretical level, it doesn't match the hardware as well as most imperative languages. (This is the flip side of one of its obvious strengths, being able to express what you want done rather than how you want the computer to do it.)

For example, functional programming makes heavy use of recursion. This is fine in pure lambda calculus because mathematics' "stack" is unlimited. Of course, on real hardware the stack is very much finite, and naively recursing over a large dataset can make your program go boom. Most functional languages optimize tail calls so that this doesn't happen, but making an algorithm tail recursive can force you into some rather unbeautiful code gymnastics (e.g., a tail-recursive map either builds the result backwards or builds up a difference list, so it has to do extra work to get back to a normal mapped list in the correct order, compared to the straightforward non-tail-recursive version).

(Thanks to Jared Updike for the difference list suggestion.)
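
As a rough Haskell sketch of that trade-off (naiveMap and tailRecMap are illustrative names, not standard functions): the direct version is not tail recursive, so in a strict language it consumes stack proportional to the input, while the tail-recursive version accumulates the result backwards and pays for an extra reverse at the end.

-- Direct recursion: clear, but in a strict language each element leaves
-- a pending (:) on the stack.
naiveMap :: (a -> b) -> [a] -> [b]
naiveMap _ []     = []
naiveMap f (x:xs) = f x : naiveMap f xs

-- Tail-recursive version: constant stack, but the accumulator comes out
-- backwards and has to be reversed to restore the original order.
tailRecMap :: (a -> b) -> [a] -> [b]
tailRecMap f = go []
  where
    go acc []     = reverse acc
    go acc (x:xs) = go (f x : acc) xs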

Chuck
  • 9
    This highlights an interesting problem with FP: programming effectively in FP requires you to know certain tricks---especially dealing with laziness. In your example, it is actually easy to keep your code tail recursive (using a strict left fold) and avoid having things blow up on you. You don't have to build the list backwards and reverse the return list. The trick is to use difference lists: http://en.wikipedia.org/wiki/Difference_list . A lot of these sorts of tricks are not so easy to figure out on your own. Luckily the Haskell community is super friendly (IRC channel, mailing lists). – Jared Updike Nov 24 '09 at 01:33
  • 4
    Thanks, Jared. Good info. In defense of my description, though, the OCaml standard library does it the way I said (stack-limited `map` and tail-recursive `rev_map`). – Chuck Nov 24 '09 at 03:47
23

If your language does not provide good mechanisms to plumb state/exception behavior through your program (e.g. syntax sugars for monadic binds) then any task involving state/exceptions becomes a chore. (Even with these sugars, some people might find it harder to deal with state/exceptions in FP.)
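
As a hedged illustration of what such sugar buys you, here is a small Haskell sketch using the Maybe monad for exception-like failure (safeDiv, calcByHand and calcSugared are invented names): the first version threads failure by hand, the second hides the same plumbing behind do-notation.

-- Division that can fail.
safeDiv :: Int -> Int -> Maybe Int
safeDiv _ 0 = Nothing
safeDiv x y = Just (x `div` y)

-- Plumbing the failure case explicitly at every step.
calcByHand :: Int -> Int -> Int -> Maybe Int
calcByHand a b c =
  case safeDiv a b of
    Nothing -> Nothing
    Just r1 ->
      case safeDiv r1 c of
        Nothing -> Nothing
        Just r2 -> Just (r2 + 1)

-- The same computation, with the monadic sugar doing the plumbing.
calcSugared :: Int -> Int -> Int -> Maybe Int
calcSugared a b c = do
  r1 <- safeDiv a b
  r2 <- safeDiv r1 c
  return (r2 + 1)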

Functional idioms often do lots of inversion-of-control or laziness, which often has a negative impact on debugging (using a debugger). (This is somewhat offset by FP being much less error-prone due to immutability/referential transparency, which means you'll need to debug less often.)

Brian
  • 5
    "immutability/referential transparency, which means you'll need to debug less often" ... and since everything is built of little independent functions, you can just test those directly; if each function is (a) a correct little function or (b) a correct composition of two or more correct little functions then wham! your program is correct. – Jared Updike Feb 16 '10 at 20:11
13

Philip Wadler wrote a paper about this (called "Why No One Uses Functional Languages") that addresses the practical pitfalls stopping people from using FP languages:

Update: inaccessible old link for those with ACM access:

Jared Updike
  • 9
    please post the relevant text of the articles. :D – Gordon Gustafson Nov 24 '09 at 01:27
  • @CrazyJugglerDrummer: I think that whole article is about this ;-) – Hynek -Pichi- Vychodil Nov 24 '09 at 17:52
  • 1
    I know, but I'd much rather be able to look at it somehow without downloading and opening it. Is that possible? – Gordon Gustafson Nov 25 '09 at 01:14
  • 2
    Sorry about the inaccessible link. I would post the HTML of the text, but the PS/PDF is actually an image and I don't have OCR software on hand. I suppose I could post a PDF of it somewhere. Not sure why ACM hides some of these older articles; don't they want to disseminate this information? – Jared Updike Dec 01 '09 at 21:51
  • 2
    Online converter for postscript files :-D http://view.samurajdata.se/ps.php?url=http%3A%2F%2Fwww.cse.iitb.ac.in%2F~as%2Ffpcourse%2Fsigplan-why.ps.gz&submit=View! – Pavel Savara Sep 22 '10 at 21:44
8

Setting aside speed and adoption issues and turning to a more basic one, I've heard it put that with functional programming it's very easy to add new functions for existing datatypes, but it's "hard" to add new datatypes. Consider:

(Written in SML/NJ. Also, please excuse the somewhat contrived example.)

datatype Animal = Dog | Cat;

fun happyNoise(Dog) = "pant pant"
  | happyNoise(Cat) = "purrrr";

fun excitedNoise(Dog) = "bark!"
  | excitedNoise(Cat) = "meow!";

I can very quickly add the following:

fun angryNoise(Dog) = "grrrrrr"
  | angryNoise(Cat) = "hisssss";

However, if I add a new case to Animal, I have to go through every existing function to add support for it:

datatype Animal = Dog | Cat | Chicken;

fun happyNoise(Dog) = "pant pant"
  | happyNoise(Cat) = "purrrr"
  | happyNoise(Chicken) = "cluck cluck";

fun excitedNoise(Dog) = "bark!"
  | excitedNoise(Cat) = "meow!"
  | excitedNoise(Chicken) = "cock-a-doodle-doo!";

fun angryNoise(Dog) = "grrrrrr"
  | angryNoise(Cat) = "hisssss"
  | angryNoise(Chicken) = "squaaaawk!";

Notice, though, that the exact opposite is true for object-oriented languages. It's very easy to add a new subclass to an abstract class, but it can be tedious if you want to add a new abstract method to the abstract class/interface for all subclasses to implement.
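
For comparison, here is a rough Haskell sketch of that flip side using a type class in an OO-like style (the class and names just mirror the SML example above and are only illustrative): adding a new animal is one self-contained block, but adding a new method to the class means revisiting every existing instance.

class Animal a where
  happyNoise   :: a -> String
  excitedNoise :: a -> String

data Dog = Dog
data Cat = Cat

instance Animal Dog where
  happyNoise   _ = "pant pant"
  excitedNoise _ = "bark!"

instance Animal Cat where
  happyNoise   _ = "purrrr"
  excitedNoise _ = "meow!"

-- Adding a new "subclass" is local and easy:
data Chicken = Chicken

instance Animal Chicken where
  happyNoise   _ = "cluck cluck"
  excitedNoise _ = "cock-a-doodle-doo!"

-- But adding angryNoise to the class means going back and updating the
-- Dog, Cat and Chicken instances, much like the OO case described above.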

Ben Torell
  • 9
    If you implemented these as subclasses of an abstract class in an OO, you'd have to write all those new functions as well. The only difference is how you organize the functions (by type or by behavior). – Chuck Nov 24 '09 at 00:58
  • @Chuck, True, fair enough. The idea I was trying to get at is that in OOP, while you still have to write the implementations of your methods, it's all done internal to the class. By adding a new subclass, you don't have to modify any siblings or parents. In fact, decent IDEs will auto-fill a skeleton subclass for you when you create it with blank methods for you to implement. But if you add a new method to the superclass, it breaks all implementing classes. The reverse is true for functional. Point well-taken. – Ben Torell Nov 24 '09 at 01:06
  • 10
    This has been named the *Expression Problem* by none other than Philip Wadler. – Jörg W Mittag Nov 24 '09 at 01:09
  • 3
    Wadler calls this the expression problem: http://en.wikipedia.org/wiki/Expression_Problem – Jared Updike Nov 24 '09 at 01:14
  • 1
    What you have are algebraic datatypes - They are considered closed, but extensible! If you want extensibility, you need inheritance or typeclasses/existentials. – Dario Nov 24 '09 at 17:16
  • 1
    This has nothing to do with functional programming. Standard ML, F# and Haskell are afflicted by this problem. Mathematica, OCaml and Clojure are not. – J D Nov 29 '10 at 22:40
  • @JD how do languages like Clojure get past the expression problem? Is it because of the dynamic typing? – David Mays Aug 10 '23 at 14:38
3

I just wanted to buzz in with an anecdote, because I'm learning Haskell right now. I'm learning Haskell because the idea of separating functions from actions appeals to me, and there are some really sexy theories behind implicit parallelization thanks to the isolation of pure functions from impure functions.

I've been learning the fold family of functions for three days now. Fold seems to have a very simple application: taking a list and reducing it to a single value. Haskell implements foldl and foldr for this, and the two functions have massively different behavior. There is also an alternate implementation of foldl, called foldl'. On top of that there are versions with slightly different signatures, foldr1 and foldl1, which take their initial value from the list itself, and there is a corresponding foldl1' for foldl1. As if all of this wasn't mind-blowing enough, the combining functions that the fold[lr].* family take as arguments and use internally in the reduction have two different argument orders, only one variant (the r one) works on infinite lists, and only one of them (the strict l one, as I understand it, because it can reduce the accumulator as it goes) runs in constant memory. Understanding why foldr can work on infinite lists requires at least a decent understanding of the language's lazy behavior and the minor detail that not every function forces the evaluation of its second argument. The graphs online for these functions are confusing as hell for someone who never saw them in college. There is no perldoc equivalent; I can't find a single plain description of what the functions in the Haskell Prelude do. (The Prelude is a kind of standard library that comes preloaded with the core language.) My best resource is really a guy I've never met (Cale) who is helping me at huge expense to his own time.

Oh, and a fold doesn't have to reduce the list to a non-list scalar: the identity function on lists can be written as foldr (:) [], as in foldr (:) [] [1,2,3,4], which highlights that you can accumulate to a list.
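
A small sketch of the distinctions described above (firstHit, bigSum and sameList are just illustrative names): foldr can stop early on an infinite list when the combining function is lazy in its second argument, while the strict foldl' keeps the accumulator evaluated and sums a large finite list in constant space.

import Data.List (foldl')

-- foldr can terminate on an infinite list because (||) never looks at
-- its second argument once the first is True.
firstHit :: Bool
firstHit = foldr (\x acc -> x > 3 || acc) False [1 ..]   -- True

-- foldl' forces the accumulator at every step, so this runs in
-- constant space.
bigSum :: Integer
bigSum = foldl' (+) 0 [1 .. 1000000]

-- Folding with (:) and [] rebuilds the list, as noted above.
sameList :: [Int]
sameList = foldr (:) [] [1, 2, 3, 4]   -- [1,2,3,4]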

/me goes back to reading.

Evan Carroll
  • 2
    These problems were created by non-strict evaluation so they are Haskell specific. – J D Nov 28 '10 at 22:20
2

Here are some problems I've run into:

  1. Most people find functional programming to be difficult to understand. This means it will probably be harder for you to write functional code, and it will almost certainly be harder for someone else to pick it up.
  2. Functional programming languages are usually slower than a language like C would be. This is becoming less of an issue over time (because computers are getting faster and compilers are getting smarter).
  3. Because they are not as widespread as their imperative counterparts, it can be difficult to find libraries and examples for common programming problems. (For example, it's almost always easier to find something for Python than it is for Haskell.)
  4. There's a lack of tools, particularly for debugging. It's definitely not as easy as opening up Visual Studio for C#, or Eclipse for Java.
Caleb
  • 6
    Do you have any figures or references to support number 2? Also for number 4, F# will be a first class fully supported language in Visual Studio 2010 – Russ Cam Nov 24 '09 at 00:32
  • 8
    I think bullets 2-4 are not intrinsic to functional programming, but more artifacts of history/culture/etc. (That is, though they may be true, they are not true 'because of FP', I think.) – Brian Nov 24 '09 at 00:58
  • 4
    Re 1: I don't think that's true. Excel is a functional programming language, and I haven't observed it being harder to understand than, say, C, BASIC, Pascal or Python. In fact, it's probably the other way around. – Jörg W Mittag Nov 24 '09 at 01:14
  • 6
    Re 2: Languages cannot be slower (or faster) than another language. Languages are just abstract rules, you cannot execute them. Only *implementations* can be slower or faster than other implementations, but then you are no longer talking about languages. In the end, to solve the same problem you need to take the same steps, therefore the performance is going to be the same. The Supero Haskell compiler for instance produces code that runs 10% faster than hand-optimized C code compiled by GCC. Well-implemented Scheme compilers produce code that is between half as fast and twice as fast as GCC. – Jörg W Mittag Nov 24 '09 at 01:25
  • 1
    Slower? Functional programming language implementations tend to be on the faster end of the pool. Almost every compiler in existence tends to produce slower programs than C, so I don't see how that reflects on functional programming at all. – Chuck Nov 24 '09 at 01:34
  • 1
    Re 2 cont'd: the GHC Haskell compiler, ATS, Scala and O'Caml seem to consistently perform comparable to Java. The SISAL programming language was known to consistently perform within 20% of hand-optimized C and Fortran on single-processor machines and outperform them on multiprocessor Cray supercomputers, in one case, a FFT routine written in SISAL running on a Cray supercomputer outperformed the FFT routine shipped *by Cray* and hand-optimized by their engineers. – Jörg W Mittag Nov 24 '09 at 01:41
  • 3
    Re 4: I'm pretty sure, anybody who has ever used the Lisp Machine IDE in the 1990s would be amazed about how crappy Eclipse and Visual Studio *still* are, almost 20 years later. Anyway, this has nothing to do with functional programming. How good Visual Studio is is a feature of Visual Studio, not imperative programming. In fact, the F# Visual Studio plugin has pretty much the exact same features as the C# and VB.NET plugins. And where there is functionality missing, it has nothing to with functional programming and everything to do with the amount of money Microsoft has allocated for F# v C#. – Jörg W Mittag Nov 24 '09 at 01:47
  • 1
    @JörgWMittag: "Excel is a functional programming language". The term "functional programming" refers primarily (following Lisp) to the use of higher-order functions. How common are they in Excel? How common is mutation via scripts? – J D Nov 16 '11 at 12:28
  • 1
    @JörgWMittag: "Languages cannot be slower (or faster) than another language". The sufficiently smart compiler myth. You can easily construct a programming language that makes it impossible to attain competitive performance in practice. FPL researchers are experts at this. – J D Nov 16 '11 at 12:30
  • 1
    @JörgWMittag: "GHC Haskell compiler, ATS, Scala and O'Caml seem to consistently perform comparable to Java". Why compare with notoriously slow imperative languages like Java? GHC, Scala and OCaml are all *much* slower than C# at filling a hash table. None of them can even express what is required to make it fast in the general case (reified generics and value types). – J D Nov 16 '11 at 12:32
  • @JörgWMittag: "The SISAL programming language was known to consistently perform within 20% of hand-optimized C and Fortran..." What market share did SISAL capture in the HPC sector? – J D Nov 16 '11 at 12:34
  • @JörgWMittag Re point 2 -- part of the slowness arises from algorithms over immutable types having a worse complexity class than their mutable counterparts. If you want to prove me wrong, I invite you to solve https://softwareengineering.stackexchange.com/questions/384524/functional-programming-changing-correlating-elements-during-list-mapping in O(n) fashion (which would be easily attainable using mutability and pointers) – Coderino Javarino Dec 30 '18 at 10:53
0

Looking away from the details of specific implementations of functional programming, I see two key issues:

  1. It seems comparatively rare that it is practical to choose a functional model of some real-world problem over an imperative one. When the problem domain is imperative, using a language with that characteristic is a natural and reasonable choice (since it is in general advisable to minimize the distance between the specification and implementation as part of reducing the number of subtle bugs). Yes, this can be overcome by a smart-enough coder, but if you need Rock Star Coders for the task, it's because it's too bloody hard.

  2. For some reason that I never really understood, functional programming languages (or perhaps their implementations or communities?) are much more likely to want to have everything in their language. There's much less use of libraries written in other languages. If someone else has a particularly good implementation of some complex operation, it makes much more sense to use that instead of making your own. I suspect that this is in part a consequence of the use of complex runtimes which make handling foreign code (and especially doing it efficiently) rather difficult. I'd love to be proved wrong on this point.

I suppose these both come back to a general lack of pragmatism caused by functional programming being used much more by programming researchers than by ordinary coders. A good tool can enable an expert to do great things, but a great tool is one that enables the ordinary programmer to approach what an expert can do, because that's by far the more difficult task.

Robert Harvey
Donal Fellows
  • 2
    And it should also be noted that many languages are not pure imperative or pure functional, no matter how they're conventionally taught. There is considerable cross-fertilization. – Donal Fellows Nov 29 '10 at 01:09