
I would suspect that delete[] is O(n) and the others are O(1), but I want to be sure. Then again, it doesn't seem right that you would get constant time for allocating a variable amount of memory.
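For concreteness, a minimal sketch of the operations in question; with a type that has a non-trivial destructor, delete[] has to run one destructor per array element, which is one reason to expect at least O(n) there regardless of what the allocator itself costs:

    #include <cstdio>

    struct Widget {
        ~Widget() { std::puts("~Widget"); }   // non-trivial destructor
    };

    int main() {
        Widget* one  = new Widget;       // single-object allocation
        Widget* many = new Widget[4];    // array allocation

        delete one;      // one destructor call, then one deallocation
        delete[] many;   // four destructor calls, then one deallocation
    }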

EMBLEM
  • The complexity of an allocation is related to O(n), since the number of physical pages that have to be mapped into the virtual address space depends on the allocation size. There's also a linked list of allocated memory headers and possibly a linked list for freed memory headers (a toy sketch of that free-list idea follows these comments). For x86, the page size is 4K, but I'm not sure what the minimum allocation is. Freeing memory doesn't always unmap and free virtual pages, and may just move a header node from the allocated list to the freed list. When a process ends, virtual addressing for that process is shut down, and all of the physical pages are freed. – rcgldr Mar 13 '15 at 22:49
  • @rcgldr - on some operating systems, physical pages are not allocated immediately. – Oliver Charlesworth Mar 14 '15 at 00:22
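A minimal sketch of the free-list idea rcgldr describes above, using a made-up fixed-size-block scheme (the names and structure here are invented for illustration; real allocators are far more involved):

    #include <cstddef>
    #include <new>

    // Toy fixed-size-block allocator: each free block starts with a header that
    // links it onto a singly linked free list. This is not how any particular
    // malloc/new is implemented; it only illustrates the header/free-list idea.
    struct BlockHeader {
        BlockHeader* next;   // valid only while the block is on the free list
    };

    constexpr std::size_t kBlockSize  = 64;
    constexpr std::size_t kBlockCount = 1024;

    alignas(std::max_align_t) static unsigned char pool[kBlockSize * kBlockCount];
    static BlockHeader* free_list  = nullptr;
    static std::size_t  next_fresh = 0;   // index of the next never-used block

    void* toy_alloc() {
        if (free_list) {                        // reuse a freed block: O(1)
            BlockHeader* h = free_list;
            free_list = h->next;
            return h;
        }
        if (next_fresh < kBlockCount)           // carve out a fresh block: O(1)
            return pool + kBlockSize * next_fresh++;
        return nullptr;                         // pool exhausted
    }

    void toy_free(void* p) {
        // place a header in the freed block and push it onto the free list: O(1)
        auto* h = new (p) BlockHeader{free_list};
        free_list = h;
    }

Note how freeing never touches the OS here: toy_free only relinks a header, which is why a free can be O(1) even when allocation has to do more work.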

1 Answer


None. None of those functions are algorithms. You're not guaranteed any particular complexity from them. They have the complexity of whatever underlying algorithm is used by the implementation.
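One way to see this for new and delete specifically: new obtains its raw memory by calling operator new, and a program may replace that function globally, so an allocation costs whatever the chosen routine costs. A sketch with a deliberately simplistic bump allocator (purely illustrative, not what any real implementation does):

    #include <cstddef>
    #include <cstdio>
    #include <new>

    // Replace the global allocation functions with a trivial bump allocator:
    // every allocation is O(1), and deallocation does nothing at all.
    alignas(std::max_align_t) static unsigned char arena[1 << 20];
    static std::size_t used = 0;

    void* operator new(std::size_t size) {
        constexpr std::size_t a = alignof(std::max_align_t);
        std::size_t offset = (used + a - 1) & ~(a - 1);   // round up to alignment
        if (offset + size > sizeof(arena))
            throw std::bad_alloc{};
        used = offset + size;
        return arena + offset;
    }

    void operator delete(void*) noexcept {}                 // nothing to undo
    void operator delete(void*, std::size_t) noexcept {}    // sized variant

    int main() {
        int* p = new int(42);    // routed through the bump allocator above
        std::printf("%d\n", *p);
        delete p;                // O(1): the replacement is a no-op
    }

Whether the default operator new is O(1), amortized O(1), or something else entirely is up to the C++ runtime and the operating system underneath it.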

jalf
  • True for these particular functions, but it is also important to note that for many functions in the Standard library, a particular complexity is required (even though the algorithm isn't specified, one of the ones with that complexity -- or better -- needs to be used in the implementation). – Ben Voigt Mar 13 '15 at 22:23
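For example (illustrative, and separate from the allocation functions in this question), the standard caps std::sort at O(N log N) comparisons (since C++11) and std::binary_search at a logarithmic number of comparisons, without naming the algorithms an implementation must use:

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    int main() {
        std::vector<int> v{5, 3, 9, 1, 7};

        // The standard requires at most O(N log N) comparisons here (since C++11),
        // but does not say which sorting algorithm must be used to achieve that.
        std::sort(v.begin(), v.end());

        // At most a logarithmic number of comparisons on a sorted random-access
        // range; again, the exact algorithm is left to the implementation.
        bool found = std::binary_search(v.begin(), v.end(), 7);
        std::printf("%d\n", found ? 1 : 0);
    }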