
I am trying to print the values of a linked list in reverse order using recursion. Here is the code:

#include <iostream>

// Node structure
struct node {
    int data;
    node* next;
    node(int val) : data(val), next(nullptr) {}
};

void print_in_reverse(node* head) {
    if (head == nullptr) {
        return;
    }
    print_in_reverse(head->next);
    std::cout << "Node value " << head->data << "\n";
}


int main() {
    // Make a test linked list
    // 2->4->6->9->NULL

    node* head = new node(2);
    head->next = new node(4);
    head->next->next = new node(6);
    head->next->next->next = new node(9);

    print_in_reverse(head);

    return 0;
}

It does the job pretty well. However, I am a little confused about the time complexity. I get that iterating over the linked list using recursion is O(n). But do we also count the time spent when the recursion returns back through the nodes, finishing the pending work in each stack frame before it pops off the stack? The question is: is the time complexity of the algorithm O(n), or O(n) + O(n)?

Pankaj Mishra
  • For `O(2n)` the constant would be dropped anyway. So it is still `O(n)`. – Robert Harvey Dec 12 '19 at 18:44
  • It is still linear. `O(n)` – PaulMcKenzie Dec 12 '19 at 18:44
  • I know the constant is dropped in asymptotic analysis. But my question is: is it simply `O(n)`, or `O(2n)`? @PaulMcKenzie – Pankaj Mishra Dec 12 '19 at 18:47
  • There's no such thing as `O(2n)`. Remember, big O doesn't tell you how fast your algorithm runs; it tells you how well it scales as `n` increases. And the way your algorithm scales for `O(2n)` is exactly the same as how it scales for `O(n)`. – Robert Harvey Dec 12 '19 at 18:50
  • @RobertHarvey What happens if the constant is very big? Is it still dropped? – Pankaj Mishra Dec 12 '19 at 18:51
  • Yes, because the scaling is still linear. A large constant just means a single n takes a long time to execute. – Robert Harvey Dec 12 '19 at 18:51
  • @RobertHarvey Thank you so much. – Pankaj Mishra Dec 12 '19 at 18:52
  • The (simplified) definition of O(f(n)) is "the set of all functions g for which there exists a c > 0 such that g(x) <= c*f(x) for all x >= some x0". One really shouldn't use big-O notation without understanding the mathematical basis for it. Understanding that definition gives a much better grasp than "drop constant factors" (which can go horribly wrong quite easily). – Voo Dec 12 '19 at 18:59
  • @PankajMishra -- In the future, if someone asks for an `O(n)` solution to a problem, as long as you run through the data in a loop (`for` loop, recursion, etc.) a *constant* number of times, the solution is still `O(n)`. – PaulMcKenzie Dec 12 '19 at 19:05

0 Answers