There is not one correct answer to this.
In some contexts, people may analyze a single arithmetic operation as "one" operation. Other times, they care a lot about exactly how many cycles an operation takes, and they may want to say that, e.g., adding ints is cheaper than multiplying floats.
In many contexts, programmers say that dereferencing a pointer is O(1). However, if you really are considering the case when n goes to infinity -- if your n is like the size of the universe -- does it still make sense to say that dereferencing a pointer into that is an O(1) operation? In some theoretical contexts, people will say that dereferencing a pointer is essentially O(log n).
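One way to see where the O(log n) comes from: just naming one of n distinct addresses takes about log2(n) bits, so any cost model that counts bit operations charges at least that much to read or compare the pointer itself. A minimal sketch in Python (which convention you pick is, again, a convention):

```python
# Sketch: the number of address bits needed to index n memory cells
# grows like log2(n). A cost model that counts bit operations would
# charge at least this much just to read or compare a pointer.
for n in [2**10, 2**20, 2**40, 2**80]:
    bits = (n - 1).bit_length()  # bits needed to address n distinct cells
    print(f"n = 2**{n.bit_length() - 1}: {bits} address bits")
```

Doubling the number of address bits squares the amount of addressable memory, which is exactly the log-vs-linear relationship.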
However, I once talked to a physicist who said that even O(log n) may not be physically realistic: there is a maximum information density that the universe can have in a given volume of space. This is called the "Bekenstein bound", and it turns out the maximum entropy is achieved by a black hole. If you assume that a volume of space n can only hold O(n) bits of memory, it basically means that in a galactic-sized computer, bits have to be packed like balls in the sphere-packing problem, and then the distance between a typical pair of bits grows as O(n^{1/3}). Since information doesn't travel faster than the speed of light... maybe dereferencing a pointer needs O(n^{1/3}) asymptotically? O.o
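To get a feel for the n^{1/3} claim: if n bits fill a 3-D region at fixed density, the volume is proportional to n, so the region's radius -- and hence the distance a signal must cross to reach a typical bit -- scales like the cube root of n. A quick back-of-the-envelope check (the numbers are illustrative, not physical):

```python
# If n bits are packed at fixed density in 3-D space, volume ~ n
# implies radius ~ n**(1/3): multiplying n by 1000 only multiplies
# the typical bit-to-bit distance by 10.
for n in [10**6, 10**9, 10**12]:
    radius = n ** (1 / 3)
    print(f"n = {n:>15,}: radius ~ {radius:,.0f} units")
```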
Point is, don't overthink this. Anything you do in asymptotic analysis is ultimately an approximation; you are just trying to get a useful idea of how long something will take. Things that you are pretty sure won't really matter in your case, you can analyze as O(1), in order to focus on the things that do matter. In another situation, you may want to analyze those same things differently, if they represent "most" of the work.
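For example, here is the same linear search analyzed under two conventions: counting only element comparisons, versus counting every primitive step. The function and the per-iteration step counts below are just illustrative choices, but the point holds regardless: the constants differ, while both totals are Θ(n).

```python
# Same loop, two cost conventions. The totals differ by a constant
# factor, but both grow linearly in the input size n.
def linear_search_costs(items, target):
    comparisons = 0  # convention A: count only element comparisons
    all_steps = 0    # convention B: also count index reads, increments, etc.
    for i in range(len(items)):
        all_steps += 3        # index read, bounds check, increment (a convention!)
        comparisons += 1
        all_steps += 1        # the comparison itself
        if items[i] == target:
            return i, comparisons, all_steps
    return -1, comparisons, all_steps

# Worst case: target absent, so every element is examined.
idx, cmp_count, step_count = linear_search_costs(list(range(100)), -1)
print(idx, cmp_count, step_count)  # -1 100 400
```

Whether you report 100 or 400 "operations" depends entirely on the convention; either way, the answer is linear in n.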
IMO, if you want to know how to answer questions about this stuff in a job interview, you should be prepared to analyze it in multiple different ways, and show them that you understand that things like whether int i = 0; counts as one operation or two are basically a convention.