I am currently studying data structures and I don't quite get how to figure out the Big O of the methods I implement, and I am often confused by how other people arrive at their Big O.
So, for example, just today I implemented a hash table in JavaScript.
To handle collisions I used 3 different bucket types for practice:
- objects
- arrays
- linked list
So, let's take the insert method and figure out its Big O notation. Please correct me if I am wrong here.
objects - constant-time insertion, so O(1). An object automatically overwrites a duplicate key, so there is no need to check whether the key already exists; I just overwrite the key/value pair.
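To make the comparison concrete, here is roughly what I mean by the object-bucket insert (a minimal sketch, not my exact code; `hash()` is just a stand-in for whatever hash function maps a key to a bucket index):

```javascript
// Stand-in hash function: maps a key to an index in [0, numBuckets)
function hash(key, numBuckets) {
  let h = 0;
  for (const ch of String(key)) h = (h * 31 + ch.charCodeAt(0)) % numBuckets;
  return h;
}

// Object bucket: assignment overwrites duplicate keys automatically, so
// inserting is O(1) regardless of how many keys share this bucket.
function insertWithObjectBucket(buckets, key, value) {
  const i = hash(key, buckets.length);
  if (!buckets[i]) buckets[i] = {};   // lazily create the bucket
  buckets[i][key] = value;            // overwrite or add in one step
}

// Usage:
const buckets = new Array(16);
insertWithObjectBucket(buckets, "name", "Alice");
insertWithObjectBucket(buckets, "name", "Bob"); // same key: overwritten, still O(1)
```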
arrays - with arrays I first need to check for duplicate keys, because if one exists I want to overwrite it. That means I have to loop through the bucket array before inserting my key/value pair, so the insert is O(N). If the bucket is empty it is O(1), because I just create a new bucket and push the key/value pair.
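Again as a rough sketch (assuming each bucket is an array of `[key, value]` pairs and reusing the `hash()` helper from above):

```javascript
// Array bucket: scan for an existing key (O(N) in the bucket size) and
// overwrite it, otherwise push a new [key, value] pair.
function insertWithArrayBucket(buckets, key, value) {
  const i = hash(key, buckets.length);
  if (!buckets[i]) {
    buckets[i] = [[key, value]];      // empty slot: O(1)
    return;
  }
  const pair = buckets[i].find(p => p[0] === key); // duplicate check: O(N)
  if (pair) pair[1] = value;          // overwrite existing key
  else buckets[i].push([key, value]); // append a new pair
}
```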
Linked list - this is similar to arrays. I need to traverse the list to see whether the key already exists, so it's O(N) again, and, as with arrays, O(1) if the bucket is empty.
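A sketch of this one too (simplified to a singly linked node shape `{ key, value, next }` rather than my actual doubly linked list class, and again using the `hash()` helper from above):

```javascript
// Linked-list bucket: walk the chain looking for a duplicate key (O(N));
// overwrite it if found, otherwise prepend a new node.
function insertWithListBucket(buckets, key, value) {
  const i = hash(key, buckets.length);
  let node = buckets[i];
  while (node) {                       // duplicate check: O(N)
    if (node.key === key) { node.value = value; return; } // overwrite
    node = node.next;
  }
  // no duplicate found: prepend a new node at the head of the chain
  buckets[i] = { key, value, next: buckets[i] || null };
}
```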
Now, from what I gather, using an object seems like the best solution for handling collisions, because it will always be faster than arrays and linked lists.
But every article I read says collisions are best handled with linked lists (I used a doubly linked list, if that matters). From what I wrote above, I don't see the benefit of a linked list over, say, objects, and a linked list doesn't seem any faster than an array either. Am I correct on this, or am I missing something big?
Appreciate your input!