ARC guarantees that objects will be automatically reference counted; the necessary retain/release calls are inserted at compile time. It goes further and requires that the code be algorithmically fully coherent (which manifests as errors when trying to convert between, say, void* and id via a plain cast -- under ARC, you have to qualify the memory management policy across such casts).
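The bridged-cast qualifiers look roughly like this (a minimal sketch; the variable names and the string are purely illustrative):

    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            NSMutableString *name = [NSMutableString stringWithString:@"context object"];

            // __bridge: no transfer of ownership; ARC keeps managing `name`.
            void *context = (__bridge void *)name;

            // __bridge_retained: ARC hands over a +1 reference that you must
            // later balance with __bridge_transfer (or CFRelease).
            void *owned = (__bridge_retained void *)name;

            // Coming back to id:
            NSString *borrowed  = (__bridge NSString *)context;        // no ownership change
            NSString *takenBack = (__bridge_transfer NSString *)owned; // ARC resumes ownership

            NSLog(@"%@ %@", borrowed, takenBack);
        }
        return 0;
    }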
ARC is not a garbage collector; there is no scanning, no threading, and no stop-the-world behavior. This means more predictable behavior at the cost of things like automatic cycle detection.
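In particular, a retain cycle is still yours to break, typically with a __weak back-reference; here is a minimal sketch (the Node class is hypothetical):

    #import <Foundation/Foundation.h>

    // Hypothetical Node class: a strong child reference and a weak parent
    // back-reference, so a parent/child pair does not form a retain cycle.
    @interface Node : NSObject
    @property (nonatomic, strong) Node *child;
    @property (nonatomic, weak)   Node *parent;   // weak breaks the cycle
    @end

    @implementation Node
    @end

    int main(void) {
        @autoreleasepool {
            Node *parent = [Node new];
            Node *child  = [Node new];
            parent.child = child;
            child.parent = parent;   // if this were strong, neither object would ever be freed
        }   // ARC inserts the releases here; it would not have detected a cycle for you
        return 0;
    }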
While ARC guarantees that an object's lifespan will be automatically managed, it does not guarantee that lifespan beyond "the object will live at least as long as it is used in the code, and possibly longer".
In fact, you might see lifespan changes depending both on the optimization level of the code and on whether the factory method you invoked was compiled in an ARC or a manual-retain-release [MRR] source file. And the lifespan may change across releases of the compiler and/or runtime.
For example, ARC code calling into a factory method can sometimes short-circuit the autorelease entirely.
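A hedged sketch of what that means in practice: whether the weak reference below is already nil at the second NSLog depends on the optimization level, on how the factory method was compiled, and on the runtime's return-value handshake (objc_autoreleaseReturnValue / objc_retainAutoreleasedReturnValue) -- none of which your code should depend on.

    #import <Foundation/Foundation.h>

    int main(void) {
        @autoreleasepool {
            __weak NSMutableArray *observer = nil;
            {
                // A factory-style call; under ARC the runtime handshake may hand
                // the object straight to the caller, skipping the autorelease pool.
                NSMutableArray *scratch = [NSMutableArray arrayWithCapacity:4];
                observer = scratch;
                NSLog(@"in scope: %@", observer);
            }
            // Whether `observer` is nil here, or only after the pool drains, is
            // an implementation detail; do not write code that depends on it.
            NSLog(@"after scope: %@", observer);
        }
        return 0;
    }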
This sounds scary, but it isn't, because of the algorithmic coherence requirement. Since there cannot be ambiguous behavior (as there can be in plain old MRR), the fact that the lifespan might change across releases should not impact your code.
Of course, this means that you should not have order dependencies between dealloc methods. This should not be an onerous requirement, as having order dependencies between dealloc methods under MRR was always a nasty thing.
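To make the anti-pattern concrete, here is a hypothetical sketch (the Logger/Session classes are purely illustrative):

    #import <Foundation/Foundation.h>

    @interface Logger : NSObject
    - (void)log:(NSString *)message;
    @end

    @implementation Logger
    - (void)log:(NSString *)message { NSLog(@"%@", message); }
    @end

    @interface Session : NSObject
    @property (nonatomic, weak) Logger *logger;   // Session does not keep the Logger alive
    @end

    @implementation Session
    - (void)dealloc {
        // Anti-pattern: this only "works" if the Logger happens to be torn down
        // after the Session. Neither ARC nor MRR guarantees that order, so the
        // message may silently go nowhere. Don't let one dealloc depend on
        // the state or existence of another object being deallocated.
        [self.logger log:@"session tearing down"];
    }
    @end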