3

What are the reason/s behind other languages not having a Garbage Collector?

Why isn't garbage collection built into these other languages? Why are programmers given the responsibility of collecting?

Aquarius_Girl
  • 21,790
  • 65
  • 230
  • 411
anonymous
  • 81
  • 2
  • 5
  • Not an exact duplicate, but start here for an informed discussion: http://stackoverflow.com/questions/147130/why-doesnt-c-have-a-garbage-collector – Michael Petrotta Mar 15 '10 at 02:48
  • 1
    So which question do you want answered. The one in the title or the one in the body? – JohnFx Mar 15 '10 at 02:55
  • @JohnFx: So what is the big difference between "Why" and "What are the reasons" (for other languages not having a garbage collector)? – Thilo Mar 15 '10 at 03:32
  • Trying to think of post-Java languages (C# is a Java dialect, which has acquired weird growth over the years; Ruby, Python and JavaScript are of around the same age). – Tom Hawtin - tackline Mar 15 '10 at 03:38
  • @Thilo: Maybe I'm hallucinating (it happens), but I swear the original title of this question was "What other languages..." – JohnFx Mar 15 '10 at 03:47
  • I don't think I agree that C# is a Java dialect; it would probably be more accurate to say that both Java and C# copied a lot of C/C++ semantics in their designs. C# actually copied almost as much from Delphi as it did from C++ (same designer, no surprise). – Aaronaught Mar 15 '10 at 03:48
  • 1
    @Tom: I find it hard to imagine how a post-1990 language without some kind of GC would thrive, except a version of an older language. Or an assembler language. Java has it for a good reason. All I can think is that Perl 5 is about contemporary with Java, and only had reference-counted, not mark-sweep GC. C and C++ satisfy almost all of the commercial demand to create memory leaks and dangling pointers, so I guess maybe look at fairly niche languages. Wikipedia says Ada implementations typically don't support GC, and the last version of that was 2005. – Steve Jessop Mar 15 '10 at 04:26
  • 1
    Rather than "languages", I think the word to be used here is "platforms". I can write C++ on .NET, a completely managed and garbage collected platform. I could also write C# on a non-.NET platform that is unmanaged. – Travis Heseman Mar 15 '10 at 13:00
  • There are other resources to be managed besides memory, so having memory managed while other resources not looks like a partial solution to me. – n0rd Mar 15 '10 at 13:07
  • @n0rd: The distinction shouldn't be between "memory" and "other", but between "values", "value holders", and "entities". Entities generally have clear identities and a clear owners who will know when the entities are no longer needed. Holders of values that might change will also generally have clear identities and ownership. Values that won't change, however, often don't have a real identity or owner. If two objects both have `String` fields that encapsulate the character sequence "Fred", either object may know when it no longer needs "Fred", but should have no reason to care about... – supercat Apr 02 '15 at 19:05
  • ...whether the object (or anyone else) might be using the same `String` object. Unless one uses something like `WeakReference` to force a `String` to behave as an entity, a string will effectively cease to exist the *instant* there is no longer a reachable reference path to it. The GC doesn't destroy objects other than those reachable only by things like weak references; instead, it reclaims memory formerly used by *non-existent objects*. – supercat Apr 02 '15 at 19:18

10 Answers

22

Reasons not to have garbage collection:

  • Really efficient collectors weren't developed until around 1985–1990. Languages designed before that time, if efficiency was a goal, don't have garbage collection. Examples: Ada, C, Fortran, Modula-2, Pascal.

  • Bjarne Stroustrup thinks it is better language design to make every cost explicit, and "not to pay for features you don't use." (See his papers in the 2nd and 3rd ACM Conferences on the History of Programming Languages.) Therefore C++ doesn't have garbage collection.

  • Some research languages use other ideas (regions, fancy type systems) to manage memory explicitly, yet safely. These ideas have special promise for problems such as device drivers, where you may not be able to afford to allocate, or for real-time systems, where memory costs must be very predictable.

Norman Ramsey
  • 198,648
  • 61
  • 360
  • 533
  • 4
    +1. Garbage collection only helps manage memory. Other patterns, like RAII, let you deterministically manage all resources, including memory, in a consistent way. – Adrian McCarthy Mar 15 '10 at 12:55
6

The hardware has no garbage collector (there was some hardware with elementary support for forwarding pointers, a feature useful in the construction of some garbage collectors, but that's far from a "GC in hardware"). Correspondingly, assembly has no GC. Assembly is a "programming language" (albeit one of the closest to the bare metal), so there you go: in the wide spectrum of existing programming languages, some will not have a GC.

It so happens that an efficient GC is not easy to implement. Good algorithms for it have been long in the making. More importantly, most good GC algorithms are good because they perform elaborate operations such as moving data elements in RAM; this is necessary for a "realtime GC", which offers guarantees on the maximum time spent on allocation (you cannot have such guarantees when there is fragmentation, and you cannot avoid fragmentation without moving objects in RAM). When an object is moved, all pointers to that object must be automatically adjusted, which can be done only if the programming language offers strong, inescapable types. For instance, this cannot be done with C or C++. In C, it is legal to print out the bytes which encode a pointer value, and then have the user type them back. The GC cannot change the brain of the user when it moves an object...

So in practice, languages without strong types are GC-less. This includes C, C++, Forth, all kinds of assembly-with-extensions languages... This does not prevent some people from writing GC implementations for such languages, e.g. Hans Boehm's GC for C and C++. It does mean, though, that the GC may fail on (weird) programs which are nominally "legal" as far as the language standard is concerned.

There are also languages with strong types but without a GC, either because their designers did not believe in it, believed they could do better without one, or balked at the extra code size (for instance, Java Card, the Java for smartcards, is GC-less, because fitting a GC into an environment with 8 kB of code and 512 bytes of RAM is not easy).

Finally, among the thousands of programming languages which have been designed ("once per week since the sixties", I was once told), some are the result of late-at-night conversations after too much alcohol, so it cannot be assumed that every feature or non-feature of all programming languages is the result of balanced rational thinking.

Thomas Pornin
  • 72,986
  • 14
  • 147
  • 189
5

"Other languages" do - this question is tagged C# and the .NET CLR most definitely does perform automatic garbage collection.

I can think of a few reasons for C++ not to have it:

  • All existing code in C++ uses explicit memory management, so implementing garbage collection would be a breaking change;

  • By the same token, C++ programmers are already accustomed to explicit memory management, so garbage collection isn't that important a feature;

  • Good garbage collection algorithms are fairly new, and C++ predates them by quite a bit. Garbage collection is a horizontal feature and the language designers would have to make major (and complicated) changes to the spec. Put simply, it's harder to bolt on a garbage collector to an existing language than it is to design it into the language from the beginning, as it was with .NET and Java.

  • Java runs in a Virtual Machine and .NET uses something similar, whereas C++ deals with native code. GC is much easier to reason about in the former case.

  • C++ is often used for applications that need to run under tight memory requirements (i.e. embedded systems), and in these instances, explicit memory management is a necessity. I suppose some sort of "opt-in" GC could solve this, but that is even harder for the language designers to implement properly.

Aaronaught
  • 120,909
  • 25
  • 266
  • 342
  • Objective-C has optional garbage collection. I consider the feature a regression as now libraries have to be tested under both environments (not that garbage collection is a regression; it can be very handy) – rpetrich Mar 15 '10 at 03:13
  • @rpetrich: Hence my first point about it being a (potentially, likely) breaking change. Converting an app from explicit MM to GC is certainly a non-trivial task because you still have explicitly manage *resources* like file handles or sockets. – Aaronaught Mar 15 '10 at 03:18
  • 1
    @Aaronaught: um, converting a C/C++ app to use the Boehm collector I linked to below is quite simple really. And you shouldn't have to do anything to manage file handles and sockets because in most GC languages, you're still expected to manually manage file handles and sockets. Would you like some references? – msalib Mar 15 '10 at 03:54
  • @msalib: As usual you've missed the point, which was that you can't just remove/ignore all the destructors because many of those destructors will also be doing resource management in addition to memory management. Converting an entire app is "simple?" You must not work on any apps longer than a few hundred lines. – Aaronaught Mar 15 '10 at 04:00
  • 2
    @Aaronaught: Please, take a moment and skim the Boehm GC page I linked to earlier. There you will find useful information like "Empirically, this collector works with most unmodified C programs, simply by replacing malloc with GC_malloc calls, replacing realloc with GC_realloc calls, and removing free calls." And one more time, would you mind writing in a more professional manner? Are you capable of that? Can you disagree with people without using snide insults? Or is that skill beyond your abilities? – msalib Mar 15 '10 at 04:06
  • @msalib: Reread your own answer and tell me again about writing "professionally." Anyway, you've parroted a vague assertion about compatibility from a page about a product that I doubt you've used personally, based on other comments you've written here. Tell me, if you were maintaining a 100 KLOC C application, is this the logic you would use to justify your decision? "Well, the web page says it *should* work, so I don't think we need to test it or anything." – Aaronaught Mar 15 '10 at 04:13
  • 3
    @Aaronaught: I have used the Boehm collector, so you're incorrect. My experience with large C applications is that even the best developers tend to screw up manual memory allocation, and when they do, data gets overwritten leading to complex failures that are very difficult to debug. Based on that experience, I'd be happy to plug in the Boehm collector if the performance for my application was good enough. You might want to read more about Boehm...he's not just some random guy with a web page. And obviously, changes to something as fundamental as GC would have to be tested rigorously. – msalib Mar 15 '10 at 04:20
  • @msalib: I know who Boehm is, and the plural of "anecdote" is not "data." If you're going to rigorously test it, fine, but that's the point - it's a potentially breaking change, and it can't simply be retrofitted into the C/C++ language design because it would break existing programs. – Aaronaught Mar 15 '10 at 04:28
  • 1
    @Aaronaught: Any change should be tested. Your claim was not that switching a C program to GC was a change that warranted testing. Everyone agrees on that. Your claim was that it would break existing programs. Most changes will not break existing programs, but I still test all changes I make anyway. You also claimed that it would be very labor intensive; this also was incorrect. – msalib Mar 15 '10 at 04:42
  • @msalib: Even if you could reasonably quantify the term "most", that's not anywhere near good enough for a language designer. If a spec change causes even 0.1% of existing programs to break, it is an unmitigated disaster. And it *is* labour intensive to actually *convert* a program to a true GC model; simply "shimming" the `malloc` and `realloc` statements does not actually make the program semantically correct, it just happens to work most of the time, same way old Windows applications still work because some shim decides to redirect an API call. – Aaronaught Mar 15 '10 at 04:56
  • Note - I say shim, I know it's not really a shim, it's just a search-and-replace, but effectively it works as a compatibility shim, redirecting one operation to another. If you want to use a 3rd-party library and do your own testing, great, but it's highly illogical to suggest that the C++ language designers could do it unilaterally and not expect a significant number of programs to break, and/or that the effort required to "upgrade" programs would be trivial or even small. – Aaronaught Mar 15 '10 at 05:06
  • @Aaronaught: I'm not disagreeing with you, just adding my experience regarding Objective-C's optional garbage collection (speaking directly to your first and fifth points :) – rpetrich Mar 15 '10 at 05:40
  • C++ and C DO have an opt-in garbage collector: the Boehm Garbage Collector. It's a "GC-as-a-library". – James M. Lay Nov 27 '21 at 18:41
4

People have already answered your question, but still, your question carries a hidden assumption, namely that "garbage collection is the solution to all problems", and I would like to discuss that assumption...

GC is not the only way to handle memory

There are at least three ways to handle memory allocation:

  • Manually (explicit `new`/`delete` or `malloc`/`free`)
  • Automatically, through scope-bound ownership (RAII and smart pointers)
  • Automatically, through a garbage collector

We agree that "Manually" can indeed be cumbersome and ugly. Now, you should note that even with a GC, there are some devious ways to leak memory.

GC does not handle resource leaks

There are a lot of limited resources in a program, in addition to memory:

  • file handles
  • other OS handles
  • network connections
  • database connections
  • etc.

Those are limited resources you want to be freed as soon as they are not used anymore, instead of "not at all" or even "when the process exits".

Those resources must usually be acquired and released manually, even in GC-powered languages (e.g. Java). If you want to see how ugly it can be, please take a look at this question:

RAII in Java... is resource disposal always so ugly?

RAII does handle memory and resource leaks

Using the RAII idiom enables you to write readable code without any leaks, memory or otherwise. Fact is, I can't remember a time when, writing C++ code, I was worried about memory allocation/deallocation, despite the fact that I use no garbage collection.

But I clearly remember that in October 2008, I had to handle a resource leak in Java, and the solution was so ugly it was disgusting.

Conclusion

What are the reason/s behind other languages not having a Garbage Collector?

The answer could be:

  • because at that time GC was not effective enough
  • because GC is not the solution to all resource leaks
  • because there is no need.

C++ falls mostly into the "there is no need" category.

GC can be a cool bonus to C++'s RAII (see Garbage Collection in C++ -- why? ), but there's no way I would exchange my RAII for your GC.

Ever.

Community
  • 1
  • 1
paercebal
  • 81,378
  • 38
  • 130
  • 159
  • I have written a recursive program to calculate Pi to only a short decimal length in several languages, and the GC languages never seem to calculate it in under 200 ms, while with some non-GC languages I can get an unmeasurable 0 ms time. So I would say use a non-GC language if you have computation-heavy operations you need results for fast/often. – mjwrazor Apr 13 '17 at 01:54
3

Some languages are old. For example C, which was originally designed for systems programming on machines much slower than today's. Garbage collection probably didn't exist then (well, maybe Lisp?) and even if it did, the designers wouldn't have wanted to spend all the CPU cycles and memory overhead on garbage collection when the programmers could do it themselves. And since the machines were so much less powerful, software was simpler, and hence it was easier for programmers to manually manage memory than it would be in the much bigger applications which might be written today.

Peter
  • 7,216
  • 2
  • 34
  • 46
  • 6
    What do you mean "maybe Lisp". Yes, Lisp. Lisp which is much older than C. – Tom Hawtin - tackline Mar 15 '10 at 03:04
  • 1
    @Tom: Lisp is much easier to garbage collect though (and have efficient emitted code), so there's more to it than age. If garbage collection in imperative languages had been so easy when C was invented, Sun would not have spent nearly 20 years improving Java GC from their first version. They'd have effortlessly slung in generational, compacting, concurrent, real-time-guarantee GC off the shelf. – Steve Jessop Mar 15 '10 at 03:14
  • 2
    Steve Jessop: can you explain why it is easier to generate GC-friendly Lisp code than C code? I mean, most Lisp implementations compile to an intermediate language that's basically portable assembler which is basically C. Lisp is an imperative language. As for Sun, well, generally I don't buy arguments that start from the premise that big corporations execute rationally and perfectly. – msalib Mar 15 '10 at 03:31
  • @msalib: Intermediate languages are jitted, which is **completely** different from a "portable assembler", it requires a special runtime and that runtime can handle the GC. A compiled application runs under the OS and the OS *doesn't* handle GC. – Aaronaught Mar 15 '10 at 03:36
  • You don't have to "buy" that they execute rationally and perfectly, and I'm not "selling" it. You might choose to believe they cannot use off-the-shelf technology if it's not available on the shelf to begin with. The story of Java GC performance is not, AFAIK, a simple case of Sun taking a very long time to get around to looking up Scheme on Google. – Steve Jessop Mar 15 '10 at 03:45
  • 1
    @Aaronaught: Intermediate languages are used for JITting but are also used extensively in static compilation; that's why GCC has several different IRs. Besides that, some Lisp implementations compile directly to C and use the system C compiler to build the resulting image. I have no idea what you're talking about wrt compiled applications and the OS handling GC...did anyone say anything about that? Obviously, the Common Lisp runtime is not identical to libc, but...so what? – msalib Mar 15 '10 at 03:48
  • @Aaronaught: I wasn't even going into that "basically C" nonsense. The output of a Lisp compiler is a *subset* of all possible assembler programs, preserving certain constraints designed to permit all kinds of interesting properties including GC. I find it hard really to grasp the deduction that because Lisp can be compiled, therefore any assembler program can be GCed. You might as well say that because C++ has strong typing, assembler is type-safe. – Steve Jessop Mar 15 '10 at 03:50
  • Steve Jessop, I certainly agree with you that Sun's problems were not just laziness about reading old Scheme papers. It is a hard problem. But I still don't understand what you mean about Lisp being more GC-friendly...can you explain please? – msalib Mar 15 '10 at 03:50
  • 1
    Steve Jessop, I never meant to imply that "because Lisp can be compiled, any assembler program can be GCed"...I'm not even sure what that means. My point it just that many Lisp implementations use a C-like (in some cases, precisely C) IR. And many of them use foreign function interfaces that interoperate quite well with C code. – msalib Mar 15 '10 at 03:52
  • @msalib: None of this is pertinent to the topic at hand, which is why C++ doesn't offer automatic garbage collection. Your argument so far is "lisp had it, so C should have had it." After that was debunked, you resorted to "well, lisp compiles to IL which is 'basically C' so therefore C should be just as easy to implement a GC for." This doesn't make *any* sense. – Aaronaught Mar 15 '10 at 04:04
  • @Steve Jessop: Exactly, *thank you*, somebody who gets it. If somebody can show me a Lisp implementation with garbage collection that would have offered acceptable performance on entry-level 1970s hardware, I would *love* to see it. I've got nothing against Lisp or GC, but the fact of the matter is that C's minimalist approach was exactly what people needed back then. *Today* is a different story entirely but we are talking about a time when hardware was *very* expensive and every cycle and every byte counted. – Aaronaught Mar 15 '10 at 04:06
  • 1
    @Steve Jessop: I don't think a GC from the time would have worked for the kernel, but C was used to write many userland programs which I believe would have been fine. Certainly, many programs of the time were not long running so GC was irrelevant: if you start up a process from inetd to handle a single request, you're going to die long before you ever get a chance to GC. Ditto for short lived programs run from the shell. Don't you agree? – msalib Mar 15 '10 at 04:14
  • I also think Sun's performance problems circa 96 are not necessarily indicative of the problems K&R faced circa 72 in that hardware had improved differentially. I really don't understand your immutable comment: the standard data structure used in Lisp is a mutable cons cell. Lisp does not emphasize immutability at all. Are you confusing it with Haskell? – msalib Mar 15 '10 at 04:16
  • If you're correct that C programs would be fine because GC would never have time to kick in and ruin performance, then in fact those C programs don't need GC or manual MM - they could cleanup at exit. I don't see what GC would bring to the party. I guess with Lisp we'd need to analyse typical programs of the time and see whether their GC performance did in fact benefit from a lower rate of mutation than typical C programs. I could be off the mark: maybe Lisp circa 1972 wasn't even bothering with incremental collection. But I don't think stop-the-world would be a good fit for C either. – Steve Jessop Mar 15 '10 at 04:42
  • Steve Jessop: I think there are many programs which are short lived on average but may in some cases be long lived. For those long lived instances, you definitely want GC. To be honest, I've never seen a stop-the-world GC. Maybe they existed in the 1970s though. Manual allocation also requires potentially significant amounts of work at arbitrary times, but this fact seems to be ignored. Again, can you explain your immutability comment? – msalib Mar 15 '10 at 04:46
  • But more importantly, if you're correct that GC was no good for the kernel, then GC was no good for C. Ritchie was hardly going to bloat C with features that got in the way of systems programming. If users wanted features from Lisp, they could/should have written their programs in Lisp. – Steve Jessop Mar 15 '10 at 04:48
  • @Steve Jessop: Systems programming != kernel programming. – msalib Mar 15 '10 at 04:50
  • "For those long lived instances, you definitely want GC". Not if, as you suggested earlier, the only reason GC was tolerable in the first place was that it never actually ran. Regarding immutable, my understanding was that Lisp programmers use "set" etc. sparingly. If true, this would reduce the amount of work that a GC has to do, compared with the typical C systems program which does little but mutate. I don't think I can say the same thing again in any more ways, so if you still think it would have been easy to implement performant GC for C in 1972, perhaps you should just do it ;-) – Steve Jessop Mar 15 '10 at 04:53
  • @Steve Jessop: I think you are misinformed regarding immutability. First, Common Lisp programs mutate a great deal; Scheme has more of a culture of minimizing mutability, or least Structure and Interpretation of Computer Programs does. Secondly, this doesn't matter: it is very hard for a GC to verify what structures are mutable and what structures are not mutable in a live Lisp program, so it has to assume that all structures that can be modified will be. Do you understand that? You seem really confused on this point. – msalib Mar 15 '10 at 04:58
  • @msalib: I don't see the relevance of the difference. Even if you won't admit that GC was a bad fit at the time for any systems programming other than the kernel, the point stands that it therefore would have made C unfit for its designed purpose. And `malloc` delay != GC delay: believe me, I have seen what happens on limited systems when an even fairly sophisticated garbage collector intrudes, and it is orders of magnitude worse than the worst case of a fairly sophisticated `malloc` implementation. – Steve Jessop Mar 15 '10 at 04:59
  • @Steve Jessop: Ritchie intended C for both kernel development and userland systems programming. The vast majority of C/C++ programs in use today are not operating system kernels. I suspect that was true in the 70s as well. So omitting capabilities that would have made the language safer, more reliable, and faster to develop for most programs doesn't seem like the best idea, especially since GC could have been optional. That would not have compromised kernel programming at all. – msalib Mar 15 '10 at 05:06
  • Perhaps if you were to outline what was necessary. And how it is superior to Boehm's library-based approach which (once he got it working) was available to any program that wants optional GC. Then go back in time to the C89 standardisation process, prove to Ritchie he's wrong, and Stroustrup it's easier than he thinks so he should just shove it in C++ too, and you're done. – Steve Jessop Mar 15 '10 at 05:28
  • @SteveJessop: "The specific advantage Lisp has is that anything immutable lends itself readily to incremental GC". Although modern functional programming languages (OCaml, F#, Haskell etc.) make heavy use of immutable data structures I do not believe Lisp did. Scheme really introduced the idea when it spun off from Lisp in the 1970s. Also, I don't think incremental collection was invented until Dijkstra's on-the-fly in 1975 so the stop-the-world GCs of the time would not have been affected by mutations anyway. – J D Jun 24 '13 at 22:44
  • @msalib: "can you explain why it is easier to generate GC-friendly Lisp code than C code?". Lisp largely abstracts away the internal representation so the language implementer can easily shape their concrete data representation to allow any form of garbage collection, e.g. to make the stack and heap traversible in order to support tracing garbage collection. In contrast, C is deliberately very explicit about concrete memory representations (in order to be a good systems programming language) which makes it very hard to traverse the heap, e.g. you can easily smuggle pointers. – J D Jun 24 '13 at 22:50
2

The simple fact is that there is no silver bullet. GC does not solve all memory/performance problems.

Lex Li
  • 60,503
  • 9
  • 116
  • 147
2

If you don't know why, it's because of money. In the early days, computers were expensive and programmers cheap. Now it's the other way around: computers are cheap and programmers expensive. And GC needs a bit of CPU to do its job.

Also, most GCs need to freeze the program occasionally to perform a full sweep. In real-time software - industrial monitoring, stock markets and so on - this is not an option. And sometimes clients can see it too: in one of the apps I co-developed, an ASP.NET website sometimes froze for a minute or so.

Another reason: nobody's perfect, and a GC can still leak memory. If you write carefully in a non-GC language, that is less likely.

Migol
  • 8,161
  • 8
  • 47
  • 69
  • Actually, run of the mill "programmers" are cheaper now than they've ever been... *competent* programmers, now that's another story. – Lawrence Dol Mar 15 '10 at 21:24
0

More modern languages like C# and Java have garbage collection because it's easier to write code if you don't have to worry about memory management; older languages don't. There are also many applications (e.g. embedded applications running without any access to virtual memory) where you need to manage exactly how much memory your application will use, and for these a language like C++ is more appropriate. Real-time applications may also limit your ability to use a garbage-collected language, as you need to be in full control of how quickly your application responds at any time.

Ian Mercer
  • 38,490
  • 8
  • 97
  • 133
  • 1
    The realtime argument doesn't seem right: most C/C++ allocators do not have guaranteed realtime performance bounds and there are realtime GC implementations available. If you're doing realtime work, you need to use special memory management implementations, no matter what language you use. – msalib Mar 15 '10 at 03:12
  • I think it's correct as written: "Real-time applications MAY limit your ability to use a garbage-collected language". And yes, agree, even when you are using a non-GC language you still need to do work to avoid memory allocations and indeed any other long-running operation during time critical sections of your code. – Ian Mercer Mar 15 '10 at 03:40
  • 2
    I'm afraid the realtime argument is very right. Take video games for example. Running a scripting language which uses garbage collection (Lua for example) can cause a garbage collect while allocating memory. This can cause the CPU to spike and spend longer than a frame to execute. This will lead to visible frame rate lag. – Cthutu Mar 15 '10 at 13:21
  • 1
    @Cthutu But that has nothing specifically to do with garbage collection. That applies to any library that can incur unbounded pauses. Garbage collectors can but they don't have to... – J D Jun 24 '13 at 23:02
  • @Ian - older languages do have garbage collection. In fact the second oldest language that is still in use invented garbage collection (Lisp). – Cthutu Jun 28 '13 at 14:15
  • @Jon - you can generalise that way. You can also have incremental garbage collectors but you still have to be careful when you use them and tune them correctly to avoid memory spikes. The best system I've seen to date is ARC in Objective-C that takes the best of both worlds: Deterministic memory use with (almost) GC programming feel. – Cthutu Jun 28 '13 at 14:17
0

C, C++, Java and C# were created at different points in time (and in that order). Both Java and C# have garbage collection.

Generally, more recently developed languages tend to have better support for memory management because the state of art has advanced each time.

Eric J.
  • 147,927
  • 63
  • 340
  • 553
  • 2
    Given that Common Lisp and Smalltalk were developed before C and C++, this doesn't make much sense. – msalib Mar 15 '10 at 03:07
  • @msalib See Steve Jessop's comment on this answer: http://stackoverflow.com/questions/2444791/why-other-languages-dont-have-automatic-garbage-collection-similar-as-that-of-th/2444829#2444829 – Tyler Mar 15 '10 at 03:24
  • @msalib: This is bogus, like all of your other comments here. Development on C started in the same year as development on Smalltalk, and C appeared on the mass market **much** earlier. Lisp was aimed at a different market entirely; it is/was a great language in its own right but has virtually no relevance to this answer or to the topic in general. – Aaronaught Mar 15 '10 at 03:29
  • @msalib: I state that more recent languages tend to have better support for memory management. I was actually a very early user of Smalltalk and am quite familiar with it. However, the vast majority of languages of the same time period did not offer garbage collection. The vast majority of languages created in the last several years do offer garbage collection. Just because you can point to a 120 year old heavy smoker doesn't mean that heavy smoking makes you live long. – Eric J. Mar 15 '10 at 03:52
  • Thanks for clarifying. I'm not sure what inference you think we should be drawing: why does the fact that C is an old language explain why it doesn't have GC? Clearly it wasn't a global ignorance of the existence of GC. – msalib Mar 15 '10 at 03:59
0

There is actually a GC for C and C++ (the Boehm-Demers-Weiser conservative collector).

But in general, the C/C++ communities developed, from day one, an aversion to learning about many successful programming-language features that were widely used in the dynamic-language communities, starting with GC. In part, I think this phenomenon emerged from the culture of Bell Labs: you had a bunch of hard-core, hard-driving, whip-smart people who were convinced they knew better and that they didn't need any language features to reduce defect rates. That's why C strings are a nightmarish security hole still creating massive security problems today: because a bunch of Bell Labs hackers knew, just knew, that they could write safe, secure code even though the API was made out of razor blades and nitroglycerin. Perfect programmers don't need netstrings and perfect programmers don't need a GC. Too bad that there are no perfect programmers. Confidence is important, but humility keeps us from destroying ourselves.

msalib
  • 440
  • 3
  • 6
  • GC is hardly unique to the "dynamic language community". Java and C#/.NET are not dynamic languages (well, .NET is getting there, but only in small doses). Nor are C and C++ owned or represented by a single unified "community." This entire rant seems to be predicated on a series of false assumptions. – Aaronaught Mar 15 '10 at 03:06
  • 1
    Aaronaught, at the time C and C++ were being designed, almost all the serious languages using GC were dynamically typed. Right? – msalib Mar 15 '10 at 03:09
  • @msalib: The only other language I know of using GC at the time was Lisp, and in order to run Lisp you needed a massively expensive Lisp machine. There was no conceivable way that the C/C++ language designers could have included that feature and still end up with a product that would be useful in their target market. Even if you ignore all this, your tirade is still a massive and implausible leap in logic. I'm *sure* you know exactly what the "hackers" at Bell Labs were thinking, yep. Who are you again? – Aaronaught Mar 15 '10 at 03:13
  • Aaronaught, Smalltalk existed. It was a real language developed before C and long before C++. And while Lisp Machines were one implementation of Lisp, there were other implementations that ran on stock hardware. – msalib Mar 15 '10 at 03:24
  • 1
    @msalib: As I replied to your other comment, Smalltalk did **not** exist before C, and C++ was modeled after C, not Smalltalk (obviously). We really need a garbage collector for SO. – Aaronaught Mar 15 '10 at 03:31
  • 1
    @Aaronaught: Smalltalk and C were developed at about the same time. Lisp was developed long before both C and C++. Since GC was an important feature of both Lisp and Smalltalk, it is not rational to argue that GC didn't appear in C/C++ because it was a novel technology that no one understood. As for who I am, I'm a guy who actually interned at Bell Labs for a while. The place was full of whip-smart people, some of whom were a bit arrogant. That's hardly a novel observation. – msalib Mar 15 '10 at 03:44
  • Nobody is arguing that C/C++ weren't designed with GC because the designers didn't know about it. That is a red herring that originated from you. What people are saying is that GC was not a good fit for the performance/resource constraints of the time, and the fact that non-trivial Lisp software needed special hardware demonstrates that point pretty clearly. Smalltalk is a non-issue here as it was virtually unknown until the 1980s. (I also seem to recall reading that Smalltalk didn't have a GC until Smalltalk-80.) – Aaronaught Mar 15 '10 at 04:24
  • @Aaronaught: I know some of the guys who worked at Symbolics. One of them worked on Symbolics' first Lisp compiler for non-Lisp Machine hardware. That very first compiler, thrown together by two guys over 3 months, missing lots of optimizations, blew away the Lisp machine hardware in terms of performance. So no, you didn't actually need custom hardware to make Lisp GC fast, even 30 years ago. – msalib Mar 15 '10 at 04:39
  • And again we're back to anecdotal "evidence." Your profile says you're 31, so this is clearly not first hand, and the story must have been 20 years old when you heard it - and even then, 1980 was a very different year from 1970. Prove it. Prove that there was a reasonably-well-known Lisp compiler in the early 1970s that could hold a candle to C's performance (both CPU and memory usage). I know the name Symbolics but as far as I know they mainly made Lisp machines. – Aaronaught Mar 15 '10 at 04:49
  • @Aaronaught: I discussed this with my Symbolics buddies two years ago. I mean, I spoke with the specific programmer who wrote the first Symbolics compiler. His office was next door to my office. My point was just that early Lisps did not need hardware support for acceptable performance, since Lisp running on commodity hardware outperformed them. – msalib Mar 15 '10 at 04:54
  • So you're recounting a story you heard two years ago about something that supposedly happened 30 years ago when we are actually talking about the conditions 40 years ago. Also it doesn't seem to be documented anywhere and isn't even answering the same question (this is about the difference between Lisp and C, not Lisp on a Lisp machine vs. Lisp on a non-Lisp machine). This is not what I'd call solid historical evidence. I stand by my original assertion that garbage collection was simply not feasible on conventional 1970s hardware given the techniques known at the time. – Aaronaught Mar 15 '10 at 05:01
  • @Aaronaught: I really don't think the story is that shocking. Software running on commodity hardware tends to outperform niche hardware. Commodity hardware can innovate faster because it has access to larger markets. That's the history of microcomputers in a nutshell. Since I didn't think you knew about that trend, I tried explaining with a more specific example. But either way, your point that super special lisp machine hardware was absolutely necessary for high performance is wrong. – msalib Mar 15 '10 at 05:15
  • My point about Lisp requiring dedicated hardware in the 1970s is wrong because you have an unverified anecdote about a specific Lisp compiler in the 1980s performing better on commodity hardware, with no details as to what aspects were measured or how. Yes, I see my mistake clearly now: it was a mistake to hope for a fact or two to supplement the wall of rhetoric. – Aaronaught Mar 15 '10 at 05:27