1

What is the big O complexity of this method, and why? I think it's O(N) because there is only one loop here, but I'm not sure.

public static void mystery(List<String> list) {
    for (int i = 0; i < list.size() - 1; i += 2) {
        String first = list.remove(i);
        list.add(i + 1, first);
    }
}
OmG
  • This looks like homework. We're not here to solve your homework problems. At least show us that you've tried something. – Luiggi Mendoza Feb 09 '13 at 05:02
  • Completely impossible to tell, because you're using the `List` interface rather than a concrete implementation. – Brian Roach Feb 09 '13 at 05:16
  • Untrue, at least with the built in list implementations. Even a linked list must traverse nodes to get to an index – Kyle Feb 09 '13 at 05:23
  • @Kyle, List is an interface. You don't know which one is being used. It's not just LinkedList or ArrayList (http://docs.oracle.com/javase/6/docs/api/java/util/List.html), and the program can implement its own... – thang Feb 09 '13 at 05:48
  • @thang Exactly why I said 'with built in list implementations'. – Kyle Feb 09 '13 at 05:54

3 Answers

3

The time complexity is O(n^2).

The loop executes floor(n/2) times. List.add(int index, element) and List.remove(int index) are both O(n) on average (see the notes below on why). This results in O(n * n/2), which is O(n^2).

Some notes on the built-in List implementations

When calling List.add(int index, element) or List.remove(int index) on an ArrayList, the elements after that index must be shifted to make room for, or close the gap left by, the element (unless it is at the end of the list). On average the number of shifts is proportional to n, so the add and remove operations are both O(n).
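
To make the shifting concrete, here is a rough sketch of what an array-backed add/remove has to do (illustration only, not the JDK source; it assumes the backing array already has spare capacity):

class ArrayShiftSketch {
    // Insert value at index: every element from index..size-1 moves one slot right.
    static <T> void addAt(T[] elements, int size, int index, T value) {
        System.arraycopy(elements, index, elements, index + 1, size - index); // up to n moves
        elements[index] = value;
    }

    // Remove the element at index: every element after it moves one slot left.
    static <T> void removeAt(T[] elements, int size, int index) {
        System.arraycopy(elements, index + 1, elements, index, size - index - 1); // up to n moves
        elements[size - 1] = null; // clear the now-unused last slot
    }
}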

List.add(int index, element) and List.remove(int index) are also O(n) when called on a LinkedList, because the list must be traversed to the given index before an element can be added or removed there.
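
The traversal cost looks roughly like this (the Node class here is hypothetical, for illustration only; the real LinkedList is doubly linked and walks from the nearer end, which is still O(n)):

class NodeWalkSketch {
    static class Node<T> {
        T value;
        Node<T> next;
    }

    // Reaching position index means following index links before any add/remove can happen.
    static <T> Node<T> nodeAt(Node<T> head, int index) {
        Node<T> current = head;
        for (int i = 0; i < index; i++) { // O(index) hops
            current = current.next;
        }
        return current;
    }
}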

We can do better when we know that the adds/removes on a given List will be sequential. Using a ListIterator over a LinkedList reduces the time complexity to O(n): add and remove are O(1) when invoked on a LinkedList's ListIterator, since there is no need to traverse to the given index.

Sample implementation of the asker's method using a ListIterator:

public static void mystery(List<String> list) {
    ListIterator<String> iterator = list.listIterator();
    int i = 0;
    while (i < list.size() - 1) {
        // Take the element at position i and remove it (O(1) through the iterator)
        String first = iterator.next();
        iterator.remove();
        // Step past the element that is now at position i
        iterator.next();
        // Re-insert the removed element at position i + 1 (also O(1))
        iterator.add(first);
        i += 2;
    }
}
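
For example, called with a small list (a quick sanity check; assumes the usual java.util imports, and the sample data is arbitrary):

List<String> list = new LinkedList<String>(Arrays.asList("a", "b", "c", "d", "e"));
mystery(list);
System.out.println(list); // [b, a, d, c, e] -- adjacent pairs swapped, the leftover odd element stays put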
Kyle
  • *List.add(int i) and list.remove(int i) are both O(n) on average* -- note that we do not actually know the average run time of remove for ArrayList; the documentation states only vaguely that it is linear. However, add on average is constant time (http://docs.oracle.com/javase/6/docs/api/java/util/ArrayList.html - *The add operation runs in amortized constant time*). It is conceivable that one could implement an ArrayList in such a way that both add and remove run in constant time for this specific case. How the SDK implements it is not revealed in the documentation. – thang Feb 09 '13 at 06:04
  • @thang the amortized constant time comment in the docs refers to the add(element) method, not the add(int, element) method (i.e. not adding at the end of the list). The underlying array has to be resized, hence the amortized qualifier. – Kyle Feb 09 '13 at 06:27
  • You may be right, but it's actually not stated to be the one add over the other. It is likely that it's just implementing an array without any optimizations, in which case you are right. As I said, it is unclear, and it is conceivable to have an ArrayList implementation where both add and remove in that specific scenario are constant time. It would be an interesting exercise to measure and confirm.... – thang Feb 09 '13 at 06:32
  • @thang I'm not sure I understand how an arraylist can implement add and remove for any given index in constant time. It's one of the core tradeoffs between an array backed list and a linked list. Give it a look and see what you find though. – Kyle Feb 09 '13 at 06:37
  • Suppose when you remove, instead of moving elements, it flags that spot as removed. When you add, it finds the closest unused spot and adjusts only up to there. If that optimization were made here, then both add and remove would be constant time in this particular case. There would be sparse adjusters to compensate for indices due to removed elements. It's a stretch of the imagination, but again, without measurement and actual statement in the documentation... well, make no assumptions. – thang Feb 09 '13 at 06:41
  • @thang: That's still O(N). The difference is that there's a smaller fraction of N. – cHao Feb 09 '13 at 06:43
  • Finding the closest unused spot is not constant time, and accesses are no longer constant time either. See: http://stackoverflow.com/questions/5682218/data-structure-insert-remove-contains-get-random-element-all-at-o1 for an interesting structure using a hash table – Kyle Feb 09 '13 at 06:45
  • @cHao, not in this example... on average you're right that it is O(N), but when you apply that to this example, it so happens that every add requires only a single swap... and hence the **in this particular case**. It just so happens that you always add 1 element away from the removed element... – thang Feb 09 '13 at 06:45
  • @thang: You still have to find the nearest empty entry. If you have to decide whether to go forward or backward (which you'd have to do in order to use that recently deleted slot), you still need to iterate. Plus, once entries no longer map directly to indexes, you lose the benefits of an ArrayList -- it becomes an O(N) operation to even *find* element N. Unless you maintain a mapping of indexes to entries, which makes insertion O(N) again (as you'd need to update indexes whenever you insert). – cHao Feb 09 '13 at 07:12
  • @cHao, not contending that it's not O(N) if you add anywhere. Just that you're using a very specific case here. If the code were randomly removing and adding, you're absolutely right, but it's not doing that. – thang Feb 09 '13 at 07:16
  • @cHao you are **completely** wrong..your `O(N)==O(N/2)` makes me laugh! – Anirudha Feb 09 '13 at 07:48
  • @Some1.Kill.The.DJ: http://en.wikipedia.org/wiki/Big_O_notation#Example . "If f(x) is a product of several factors, any constants (terms in the product that do not depend on x) are omitted." N/2 is simply N * 1/2. So when we're talking Big-O, O(N/2) *is* O(N). – cHao Feb 09 '13 at 10:06
  • @thang: If we're given specs or code for this hyper-specialized container, and they indeed show constant-time inserts for this particular job, then we can say O(1). Til then, because of how lists work, we can safely say O(N) (which *includes* O(1), as the notation describes an upper bound and thus includes smaller values). – cHao Feb 09 '13 at 10:32
  • @cHao, sure, but the tighter the bound the cooler it is... :p in fact, we don't know if other built in List implementations have lower run time.... that's my whole point in this whole discussion -- not that the solution is wrong, but make no assumptions. so for example, if someone gave you that code to look at and it is mission critical code, and the immediate conclusion is that it is O(N^2), what if your assumption about the SDK doesn't hold... – thang Feb 09 '13 at 10:33
  • @thang: They're not just assumptions. :) The biggest assumption is that the Java team knows about complexity. Authors of container classes almost invariably document the time complexity of their operations, as that's often what makes the container fit or not fit for a given task. If it's really critical code, and performance is that important, then either we have documentation and/or the container's source code in hand, or the code's wrong -- no matter how fast it currently is -- because you're relying on magic and unwarranted assumptions and details that could change tomorrow. – cHao Feb 09 '13 at 19:17
0

It looks like it's O(n^2): the method iterates through the list, and on each iteration it calls list.remove(int), which itself runs in O(n). Therefore the time complexity of this function should be O(n^2).
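
One way to convince yourself empirically is a rough timing sketch like the one below (not a rigorous benchmark; the sizes are arbitrary, and "doubling n should roughly quadruple the time" is just a rule of thumb for O(n^2)):

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;

public class MysteryTiming {
    public static void mystery(List<String> list) {
        for (int i = 0; i < list.size() - 1; i += 2) {
            String first = list.remove(i);
            list.add(i + 1, first);
        }
    }

    public static void main(String[] args) {
        for (int n = 10000; n <= 80000; n *= 2) {
            List<String> list = new ArrayList<String>(Collections.nCopies(n, "x"));
            long start = System.nanoTime();
            mystery(list);
            long millis = (System.nanoTime() - start) / 1000000;
            // If the method is O(n^2), doubling n should roughly quadruple the time.
            System.out.println("n = " + n + " took " + millis + " ms");
        }
    }
}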

cHao
William Rookwood
  • list.remove depends on which List implementation it is; for a LinkedList it is O(1) – Oleg Mikheev Feb 09 '13 at 05:13
  • @Oleg Mikheev That's only true when using an Iterator, or at the start and end of a LinkedList – Kyle Feb 09 '13 at 05:20
  • I wonder if the compiler is smart enough to notice that you are doing sequential item swapping... and optimize list access appropriately. – thang Feb 09 '13 at 05:46
  • @thang: I highly doubt it. It'd need to be psychic to know whether it could safely muck around like that with a class it's not compiling. – cHao Feb 09 '13 at 10:44
0

When running this method, it will execute:

remove 0
append 1
remove 1
append 2
...

Assume the List is an ArrayList:

Assume the removed element is always at the end. So: n/2 * (2n + n) = 3n^2/2 => O(n^2)

Assume the removed element is always at the front. So: n/2 * (n + n) => O(n^2)

So, because both the worst case and the best case are O(n^2), the bound is tight: not just Big-O, the complexity is Θ(n^2).
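
To double-check the exact sequence of removes and adds, the loop can be instrumented with a couple of print statements (a small sketch; the sample data is arbitrary):

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class TraceMystery {
    public static void main(String[] args) {
        List<String> list = new ArrayList<String>(Arrays.asList("a", "b", "c", "d", "e", "f"));
        for (int i = 0; i < list.size() - 1; i += 2) {
            String first = list.remove(i);
            System.out.println("remove index " + i);
            list.add(i + 1, first);
            System.out.println("add at index " + (i + 1));
        }
        System.out.println(list); // adjacent pairs end up swapped
    }
}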

hqt
  • Not when it's a linked list. It will be O(1) – Shivam Feb 09 '13 at 05:20
  • @ShivamKalra not really. Even if the removed element is always at the end, you still need time to traverse to it. – hqt Feb 09 '13 at 05:23
  • @ShivamKalra: Try removing from a linked list at an index, sans iterator, without counting your way to the entry (read: running in O(N)). – cHao Feb 09 '13 at 05:23
  • @cHao Right. Only when you've reference of that node then its O(1). – Shivam Feb 09 '13 at 05:27
  • Remove and add are implemented in constant time for a LinkedList using the ListIterator – Kyle Feb 09 '13 at 05:42
  • I think it should be remove 0, append 1, remove 2, append 3, etc., not what is shown there... – thang Feb 09 '13 at 05:45