
I want to analyze the cost of moving the element at index 2 to the front, turning the list of integers 1 2 3 4 into 3 1 2 4, using either a LinkedList or an ArrayList.

What I have done:

aux = arraylist.get(2); // O(1)
arraylist.remove(2); // O(n)
arraylist.add(0, aux); // O(n), shifting elements up.

aux = linkedlist.get(2); // O(n)
linkedlist.remove(2); // O(n)
linkedlist.addFirst(aux); // O(1)
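
For reference, the two sequences above can be run end to end (assuming `Integer` elements as in the example); both produce `[3, 1, 2, 4]`:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedList;

public class MoveDemo {
    public static void main(String[] args) {
        ArrayList<Integer> arraylist = new ArrayList<>(Arrays.asList(1, 2, 3, 4));
        Integer aux = arraylist.get(2);   // O(1) random access
        arraylist.remove(2);              // O(n): shifts trailing elements down
        arraylist.add(0, aux);            // O(n): shifts everything up
        System.out.println(arraylist);    // [3, 1, 2, 4]

        LinkedList<Integer> linkedlist = new LinkedList<>(Arrays.asList(1, 2, 3, 4));
        Integer aux2 = linkedlist.get(2); // O(n): traverses to index 2
        linkedlist.remove(2);             // O(n): traverses again
        linkedlist.addFirst(aux2);        // O(1): new head node
        System.out.println(linkedlist);   // [3, 1, 2, 4]
    }
}
```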

So, in this case, can we say that they are the same or am I missing something?

JorgeeFG
  • Try writing a benchmark with some very large data sets and seeing what the behavior is. With only four elements it'll be hard to really see a difference. – erapert May 10 '17 at 18:02
  • For `arraylist.remove(2)` you forgot shifting elements down. The shifting of elements is a very expensive operation. In any case, you're mixing the complexity of operations that are very different in nature (searching and copying), so it doesn't make much sense to compare `ArrayList` and `LinkedList` on a single scale. – janos May 10 '17 at 18:06
  • 3
    One thing you've overlooked is that `remove()` returns the removed item, so for your linked list there's no reason to do a separate `get()`. You can just `linkedlist.addFirst(linkedlist.remove(2));` – azurefrog May 10 '17 at 18:09
  • @azurefrog thanks, that is a very good tip. I'm getting rid of one O(n) – JorgeeFG May 10 '17 at 18:12
  • Couldn't you theoretically save the elements in reversed order in the ArrayList and then just read from last to first, so the addFirst would become a simple add and because the list already has space for it the complexity would be the same again as for the LinkedList (but of course much less practical). – maraca May 10 '17 at 18:23
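
maraca's reversed-storage idea can be sketched as follows (this is an illustration of the comment, not code from the question): store the list in reverse, so that "move to the front" becomes an append at the end, which is amortized O(1) since the ArrayList already has capacity. The removal is still O(n).

```java
import java.util.ArrayList;
import java.util.Arrays;

public class ReversedTrick {
    public static void main(String[] args) {
        // Store 1 2 3 4 in reverse: the logical front is the physical end.
        ArrayList<Integer> rev = new ArrayList<>(Arrays.asList(4, 3, 2, 1));
        // Logical index 2 maps to physical index size - 1 - 2.
        Integer x = rev.remove(rev.size() - 1 - 2); // still O(n): shifts elements down
        rev.add(x); // "addFirst" is now an append: amortized O(1), no shifting
        System.out.println(rev); // [4, 2, 1, 3], i.e. logical order [3, 1, 2, 4]
    }
}
```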

3 Answers


You can indeed say this specific operation takes O(n) time for both a LinkedList and an ArrayList.

But you cannot say that, as a result, they take the same amount of real time.

Big-O notation only tells you how the running time scales as the input gets larger, since it ignores constant factors. An O(n) algorithm may perform 1*n operations, or 100000*n operations, or a whole lot more, and each of those operations can take a greatly varying amount of time. This also means big-O is generally a pretty inaccurate measure of performance for small inputs, where constant factors can affect the running time more than the size of the input does (but performance differences tend to be less important for small inputs).
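
To make the constant-factor point concrete, here is a contrived pair of loops (not from the question) that are both O(n) yet do very different amounts of work per element:

```java
public class ConstantFactors {
    public static void main(String[] args) {
        int n = 1_000_000;
        long a = 0, b = 0;
        // O(n) with a tiny constant: one addition per element.
        for (int i = 0; i < n; i++) a += i;
        // Still O(n) -- 100n is dropped to n -- but ~100x the work.
        for (int i = 0; i < n; i++)
            for (int k = 0; k < 100; k++) b += i;
        System.out.println(a); // 499999500000
        System.out.println(b); // 49999950000000
    }
}
```

Both loops are in the same complexity class, but the second takes roughly 100 times longer; big-O deliberately hides that difference.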

See also: What is a plain English explanation of "Big O" notation?

Bernhard Barker

Here's a quick and dirty benchmark. I timed both operations, repeating the move ten million times:

import java.util.ArrayList;
import java.util.Arrays;
import java.util.LinkedList;

public class Main {

    public static void main(String[] args) {
        long arrayListTime = 0;
        long linkedListTime = 0;
        int n = 10_000_000;
        for (int i=0; i<n; i++) {
            ArrayList<Integer> A = new ArrayList<>(Arrays.asList(1,2,3,4));
            long startTime = System.currentTimeMillis();
            int x = A.remove(2);
            A.add(0, x);
            long endTime = System.currentTimeMillis();
            arrayListTime += (endTime - startTime);

            LinkedList<Integer> L = new LinkedList<>(Arrays.asList(1,2,3,4));
            long startTime2 = System.currentTimeMillis();
            int x2 = L.remove(2);
            L.addFirst(x2);
            long endTime2 = System.currentTimeMillis();
            linkedListTime += (endTime2 - startTime2);
        }

        System.out.println(arrayListTime);
        System.out.println(linkedListTime);
    }
}

The difference is pretty small. My output was:

424
363

So using LinkedList was only 61 ms faster over the course of 10,000,000 operations.

jmhummel
  • Note that this benchmark ignores cache performance: in both the ArrayList and LinkedList loops, the data is likely already in the CPU cache before the timing starts. This needs to be verified, but I believe cache is going to be the biggest contributor to performance here. – amit May 10 '17 at 19:36

Big-O complexity means nothing when the input is very small, as in your case. Recall that the definition of big O only holds for "large enough n", and I doubt you have reached that threshold at n == 4.

If this runs in a tight loop with a small data size (and with different data each time), the only thing that will actually matter is cache performance, and the ArrayList solution is much more cache friendly than the LinkedList solution, since each node in a linked list is likely to require a separate cache fetch.
In this case, I would even prefer a raw array (int[]), to avoid the redundant wrapper objects, which trigger additional cache misses.
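
A minimal sketch of that raw-array variant (my own illustration, not from the question): copy the element out, shift the prefix right with `System.arraycopy`, and write it back at index 0, all within one contiguous int[].

```java
import java.util.Arrays;

public class RawArrayMove {
    public static void main(String[] args) {
        int[] a = {1, 2, 3, 4};
        int x = a[2];
        // Shift a[0..1] right by one position to make room at the front.
        System.arraycopy(a, 0, a, 1, 2);
        a[0] = x;
        System.out.println(Arrays.toString(a)); // [3, 1, 2, 4]
    }
}
```

No boxing, no node objects, and the whole move touches a single cache line for a list this small.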

amit