
I have a piece of Java code and I have been struggling to work out its time complexity. In the best case I believe it is O(n), and in the worst case probably O(n^2), because it could recurse up to n times. Is this right?

All the other methods it calls are O(1).

public void associateblocks() {
    Block block = head;
    while (block != null) {
        Block block2 = block.getNextBlock();
        if (block2 != null) {
            // Two adjacent free blocks of equal size are merged into one block of double the size.
            if (block.getSize() == block2.getSize() && block.isFree() && block2.isFree()) {
                block.setSize(block.getSize() * 2);
                block.setIndex((block.getIndex() - 1) / 2); // update the block's index
                block.setNextBlock(block2.getNextBlock());  // unlink block2 from the list
                associateblocks();                          // restart from the head: a merge can enable further merges
                freeBlocks.remove(block2);
            }
        }
        block = block.getNextBlock();
    }
}
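
For reference, a `Block` here is roughly shaped like this; this is a simplified sketch matching the accessors used above, not the actual class:

```java
// Sketch of a Block as used above: a size, an index, a free flag and a
// pointer to the next block. Field names are guesses, not the real source.
public class Block {
    private int size;        // block size (a power of two)
    private int index;       // index of the block
    private boolean free;    // true if the block is not allocated
    private Block nextBlock; // next block in the singly linked list

    public Block(int size, int index, boolean free) {
        this.size = size;
        this.index = index;
        this.free = free;
    }

    public int getSize()              { return size; }
    public void setSize(int size)     { this.size = size; }
    public int getIndex()             { return index; }
    public void setIndex(int index)   { this.index = index; }
    public boolean isFree()           { return free; }
    public Block getNextBlock()       { return nextBlock; }
    public void setNextBlock(Block b) { this.nextBlock = b; }
}
```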
Pedro Martins
  • It's impossible to answer without seeing the code for all the methods. – cpp beginner Mar 03 '18 at 17:25
  • In the worst case, it could be an infinite loop. I agree with cpp. – JB Nizet Mar 03 '18 at 17:27
  • All the other methods are O(1); I will edit the post. – Pedro Martins Mar 03 '18 at 17:27
  • Is this code not causing a `stackoverflow exception`? – tryingToLearn Mar 03 '18 at 17:30
  • The code compiles and executes as intended. I can't give the full source code, unfortunately. – Pedro Martins Mar 03 '18 at 17:32
  • Why does the method call itself recursively? What is the reason for that? – luk2302 Mar 03 '18 at 17:34
  • I think that without understanding what in the world it's doing, and what the underlying data structure is (what are `head` and `freeBlocks`?), it's not going to be possible to do more than hand-wave an estimate. Why don't you manually work through it with a few simple inputs and see if you can extrapolate? (The trick will be that nested recursive call.) – Roddy of the Frozen Peas Mar 03 '18 at 17:34
  • Well, I can try to give some insight into what it is doing. I programmed a simple buddy-system algorithm to allocate memory. A block is a memory request and has a size, an id, and a reference to the next block. This method finds blocks that are free and of the same size and, if it does, merges them into a block of double the size. I call this function again because that merge can make other merges possible. Sorry I can't explain it better; I realise now it might be impossible for someone to answer without the full source code. – Pedro Martins Mar 03 '18 at 17:39
  • It's starting to make some sense, although I still don't understand what the index of a block is. – cpp beginner Mar 03 '18 at 17:44
  • Well, I forgot about the index: I made it a private attribute of a block purely for debugging some situations completely unrelated to this one. – Pedro Martins Mar 03 '18 at 17:49
  • Also, the recursive call doesn't feel right to me. It means you repeat a lot of work by repeatedly starting from the beginning of the list. It feels like you should have either a loop or a recursive call, not both. – cpp beginner Mar 03 '18 at 17:49
  • I don't know how to do it better. Suppose my list has these blocks, all free: (8, 8, 16, 32). The head is the first 8. I merge 8 with 8 and get (16, 16, 32); my new head is 16. I merge 16 with 16 to get (32, 32), then 32 with 32 = (64). – Pedro Martins Mar 03 '18 at 17:52
  • Whenever you merge two blocks, you need to be able to look in both directions to see if the next merge is possible. E.g. in the examples 2, 2, 4 and 4, 2, 2 you can merge the two 4s after the 2s, but the direction is different for the two cases. You could do that reasonably efficiently, without recursion, if you had a doubly linked list (see the sketch after these comments). – cpp beginner Mar 03 '18 at 17:59
  • Would implementing a doubly linked list reduce the worst case to O(n)? – Pedro Martins Mar 03 '18 at 18:05
  • If n is the number of blocks, then yes, I think so. – cpp beginner Mar 03 '18 at 18:14
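
A minimal sketch of the loop-only approach cpp beginner describes, assuming `Block` is extended with `getPrevBlock()`/`setPrevBlock()` (those methods and the doubly linked structure are assumptions, not part of the original code): after every merge, step back one block so the enlarged block can be compared with its predecessor.

```java
// Sketch only: assumes Block is doubly linked, i.e. it also has
// getPrevBlock()/setPrevBlock(Block) in addition to the methods used above.
public void associateBlocksIteratively() {
    Block block = head;
    while (block != null) {
        Block next = block.getNextBlock();
        if (next != null && block.isFree() && next.isFree()
                && block.getSize() == next.getSize()) {
            // Merge 'next' into 'block': double the size and unlink 'next'.
            block.setSize(block.getSize() * 2);
            block.setNextBlock(next.getNextBlock());
            if (next.getNextBlock() != null) {
                next.getNextBlock().setPrevBlock(block);
            }
            freeBlocks.remove(next);
            // Step back so the enlarged block is compared with its predecessor;
            // if there is no predecessor, stay put and re-check the new successor.
            if (block.getPrevBlock() != null) {
                block = block.getPrevBlock();
            }
        } else {
            block = block.getNextBlock();
        }
    }
}
```

Each merge removes one block from the list, so there can be at most n - 1 backward steps in total; otherwise the pointer only moves forward, which keeps the whole pass at O(n) for n blocks, matching cpp beginner's estimate.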

1 Answer


The worst case is O(n^2), if you assume all blocks are "free" and their sizes are decreasing powers of 2, with the last two being 1:

2^n, 2^(n-1), ..., 64, 32, 16, 8, 4, 2, 1, 1

The first iteration merges the last two blocks and triggers a recursive call that now operates on a list that looks like

2^n, 2^(n-1) ... , 64, 32, 16, 8, 4, 2, 2

That call merges the last two, recurses again, and now operates on

2^n, 2^(n-1) ... , 64, 32, 16, 8, 4, 4

and so on. The first call does n loop iterations, the next n-1, then n-2, ... Summing all of those gives n * (n + 1) / 2 steps, i.e. O(n^2).
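
The same count, written as a recurrence where T(k) is (roughly) the work done by a call that sees k blocks:

```latex
% One full scan of the current list plus a recursive call on a list
% that is one block shorter after the merge.
T(k) \approx k + T(k-1), \qquad T(1) = 1
\;\Longrightarrow\;
T(n) \approx \sum_{k=1}^{n} k = \frac{n(n+1)}{2} = O(n^2)
```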

The best case is O(n): you iterate over the list once, nothing is mergeable, and no recursion happens.

The average case is somewhere in between... I cannot calculate that one.

luk2302