This question, Time complexity of memory allocation, says that memory allocation takes non-deterministic time. Given that, how is it valid to say, for example, that "adding an item to a dynamic array takes amortized O(1) time"? If adding the item causes the dynamic array to resize, a new backing buffer must be allocated. For all we know, that allocation could take O(1) time, O(n^2) time, or time described by some expression that dwarfs n entirely, depending on the algorithm the memory allocator uses and its current state.
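To make the scenario concrete, here is a minimal sketch of the append operation I mean, assuming a doubling growth policy (the `DynArray` struct and `dynarray_push` name are just for illustration). The standard amortized analysis charges for the O(len) element copies on resize but treats the `malloc` call itself as O(1):

```c
#include <stdlib.h>
#include <string.h>

/* A minimal dynamic array of ints with a doubling growth policy. */
typedef struct {
    int    *data;
    size_t  len;
    size_t  cap;
} DynArray;

/* Append one element. Amortized O(1) in the usual analysis, which
   counts the memcpy below as O(len) but assumes malloc is O(1). */
int dynarray_push(DynArray *a, int value) {
    if (a->len == a->cap) {
        size_t new_cap = a->cap ? a->cap * 2 : 1;
        /* The allocation whose cost the question is about: */
        int *new_data = malloc(new_cap * sizeof *new_data);
        if (!new_data) return -1;
        if (a->data)
            memcpy(new_data, a->data, a->len * sizeof *new_data);
        free(a->data);
        a->data = new_data;
        a->cap  = new_cap;
    }
    a->data[a->len++] = value;
    return 0;
}

int main(void) {
    DynArray a = {0};
    for (int i = 0; i < 1000; i++)
        dynarray_push(&a, i);  /* occasional pushes also pay for a resize */
    free(a.data);
    return 0;
}
```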
Do academic papers simply ignore the cost of memory allocation, i.e., assume each allocation takes O(1) time, when calculating time complexity?