Is there a complexity guarantee for the memory allocation operations in the recent C++ standards? That is, if I have a class A whose default constructor and destructor run in O(1), what is the big-O of `new A[N]` and of the corresponding `delete[]`? Is there any complexity guarantee for `new int[N]`?
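For concreteness, here is a minimal sketch of the scenario (the class body and the variable names are placeholders of my own):

```cpp
// Minimal sketch of the scenario; the class body and names are placeholders.
#include <cstddef>

struct A {
    A() {}   // runs in O(1)
    ~A() {}  // runs in O(1)
};

int main() {
    const std::size_t N = 1000;
    A* a = new A[N];      // what complexity does the standard guarantee here?
    delete[] a;           // ...and here?
    int* p = new int[N];  // ...and for a trivial type with no initializer?
    delete[] p;
}
```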
-
On any real machine, I don't see how `new` could be less than O(N), since for instance the OS may have to zero out memory before giving it to you. – Nate Eldredge Feb 10 '20 at 21:07
-
Related: https://stackoverflow.com/questions/23320619/new-delete-complexity – 463035818_is_not_an_ai Feb 10 '20 at 21:18
-
Also somewhat related: https://stackoverflow.com/questions/282926/time-complexity-of-memory-allocation. My question specifically asks about what the standard guarantees. – 0kcats Feb 11 '20 at 22:57
-
Initialization of an array of N elements would be O(N). However, the memory allocation part is unclear, so my question is mostly about the allocation part. – 0kcats Feb 11 '20 at 23:12
-
If the standard guarantees the performance of operations on the standard containers, for example `vector::push_back`, then one could try to infer the worst-case performance of `new` and `delete` from those guarantees; but in that case the standard should really specify the performance of `new` explicitly. – 0kcats Feb 11 '20 at 23:16
-
Note the "not a bug" resolution in Visual Studio for the memory allocation report "malloc/free dramatic performance slowdown". If the performance of allocation were specified in the standard, we could at least argue that this is a bug. https://developercommunity.visualstudio.com/content/problem/552439/mallocfree-dramatic-performance-slowdown.html – 0kcats Sep 14 '20 at 03:01
2 Answers
I was not able to find anything that explicitly mentions complexity. I am also fairly sure that any complexity question is moot when it comes to operator new itself (i.e., memory allocation per se).
There are complicated C++ runtime heap management structures, built on top of OS-level memory management, which could involve application-level locks, OS-level locks, file-based swapping, etc. For those reasons, the answer below does not discuss memory allocation per se.
However, if we focus on the new-expression itself, stitching together [expr.new/22]:
A new-expression that creates an object of type T initializes that object as follows: (22.1) If the new-initializer is omitted, the object is default-initialized ([dcl.init]).
and [dcl.init/7]:
To default-initialize an object of type T means: .... (7.2) If T is an array type, each element is default-initialized.
I can conclude that the complexity of such an operation would be O(N).
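As an illustration (my own sketch, not from the standard text), the per-element default-initialization can be made visible by counting constructor calls, showing that `new A[N]` does O(N) initialization work on top of whatever the allocation itself costs:

```cpp
// Sketch: count constructor calls to make the per-element
// default-initialization visible. (Illustrative code, not from the standard.)
#include <cstddef>
#include <iostream>

struct A {
    static std::size_t ctor_calls;
    A() { ++ctor_calls; }  // O(1) body
};
std::size_t A::ctor_calls = 0;

int main() {
    const std::size_t N = 100;
    A* a = new A[N];  // default-initializes each of the N elements
    delete[] a;
    std::cout << A::ctor_calls << '\n';  // prints 100: one call per element
}
```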

I doubt that there are any exact guarantees, as too much depends on the platform (imagine some really weird theoretical platform that nobody will ever build, but could). However, you can form certain expectations. If construction and destruction take O(1), then `delete[]` on an array of n elements of type `A` is either O(1) or O(n), depending on whether `A` is trivially destructible or requires a non-trivial destructor call per element. The O(1) can actually be large (not truly O(1)), as deallocation of a large piece of contiguous memory might take a bit of time, but it is still insignificant in the greater scope of things.
The same is true for `new A[n]`. It takes roughly that (fake) O(1) if `A` is trivially constructible (like `int`, i.e., indeterminate values are allowed) or O(n) if each element requires some O(1) processing.
Generally, you shouldn't forget the cases where construction and destruction are slow for `A` and might take a large chunk of the total time.
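To illustrate the distinction (a sketch of my own; the type names `Trivial` and `NonTrivial` are invented for the example), the standard type traits from `<type_traits>` let you check which case a given element type falls into:

```cpp
// Sketch: the type names Trivial and NonTrivial are invented for this example.
#include <type_traits>

struct Trivial { int x; };               // trivial construction and destruction
struct NonTrivial { ~NonTrivial() {} };  // user-provided destructor

// new Trivial[n] may leave the elements with indeterminate values,
// so no per-element initialization work is required:
static_assert(std::is_trivially_default_constructible<Trivial>::value, "");

// delete[] on Trivial need not run any per-element destructors:
static_assert(std::is_trivially_destructible<Trivial>::value, "");

// delete[] on NonTrivial must run n destructor calls, hence O(n):
static_assert(!std::is_trivially_destructible<NonTrivial>::value, "");

int main() {}
```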

-
I can imagine reasonable-sounding implementations where `delete[]` would be O(N) even if `A` is trivially destructible. Say the system zeros out deallocated memory for security, or needs to update page table entries, etc. Are you asserting that this would be forbidden by the standard? – Nate Eldredge Feb 10 '20 at 23:05
-
@NateEldredge In the first paragraph it was claimed that the standard doesn't really make any such claims. To begin with, there is no control over how much time allocation/deallocation of a given size takes. Beyond that, it's just general expectations. – ALX23z Feb 11 '20 at 06:39