
I am currently studying data structures and I don't quite get how to figure out the Big O for methods that I implement, and I am often confused by how other people arrive at the Big O.

So for example, just today, I implemented a hash table in JavaScript.

To handle collisions, I used 3 different bucket types for practice:

  1. objects
  2. arrays
  3. linked lists

So, let's work through the insert method and figure out its Big O notation. Please correct me if I am wrong here; I put a small sketch of what I mean right after the list.

  1. objects - constant-time insertion, so O(1), and it automatically overwrites duplicate keys, so there is no need to check whether a key already exists; I just overwrite the key/value pair.

  2. arrays - if I use arrays, I first need to check for duplicate keys, because if there are any I want to overwrite them. That means I have to loop through the array bucket before inserting my key/value pair, so the Big O is O(N). If the bucket is empty, it is O(1) because I just create a new bucket and push the key/value pair.

  3. linked lists - this is similar to arrays. I need to traverse the list to see if there are any duplicate keys, so it is O(N) again. And, same as arrays, it is O(1) if the bucket is empty.
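
Here is a minimal sketch of what I mean for the object bucket vs. the array bucket (simplified from my actual code; the hash function and the `[key, value]` pair layout are just assumptions for illustration):

```javascript
// Minimal sketch, not my full implementation. The hash function and the
// [key, value] pair layout inside array buckets are assumptions for illustration.
class HashTable {
  constructor(size = 16) {
    this.size = size;
    this.buckets = new Array(size);
  }

  // Toy hash function, just to pick a bucket index.
  hash(key) {
    let total = 0;
    for (const char of String(key)) total += char.charCodeAt(0);
    return total % this.size;
  }

  // 1. Object bucket: overwriting a duplicate key is a single property
  //    assignment, so the insert itself is O(1).
  insertObjectBucket(key, value) {
    const index = this.hash(key);
    if (!this.buckets[index]) this.buckets[index] = {};
    this.buckets[index][key] = value; // duplicate keys overwritten automatically
  }

  // 2. Array bucket: I have to scan the whole bucket for a duplicate key
  //    before pushing, so the worst case is O(N) for a bucket holding N pairs.
  insertArrayBucket(key, value) {
    const index = this.hash(key);
    if (!this.buckets[index]) this.buckets[index] = []; // empty bucket: O(1) path
    const bucket = this.buckets[index];
    for (const pair of bucket) {
      if (pair[0] === key) { // found a duplicate: overwrite and stop
        pair[1] = value;
        return;
      }
    }
    bucket.push([key, value]); // no duplicate found: append at the end
  }
}

// Use separate tables so the two bucket types don't get mixed in one array.
const objTable = new HashTable();
objTable.insertObjectBucket("name", "giraffe");

const arrTable = new HashTable();
arrTable.insertArrayBucket("name", "giraffe");
arrTable.insertArrayBucket("name", "zebra"); // scans the bucket and overwrites
```

The linked-list insert I wrote is basically the array version with node traversal instead of index iteration, which is why I gave it the same O(N).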

Now, from what I gather, using an object seems like the best solution for handling collisions because it will always be faster than arrays and linked lists.

But every article I read says collisions are best handled using linked lists (I used a doubly linked list, if it matters). From what I wrote above, I don't really see the benefit of using a linked list over, say, objects. A linked list doesn't seem any faster than an array either. Am I correct on this, or am I missing something big?

Appreciate your input!

stargiraffe
  • Does this help you? https://www.bigocheatsheet.com/ – Carlos1232 Jul 31 '20 at 15:16
  • There are other considerations besides retrieval complexity. There's also the rest of the CRUD cycle. For instance, [Big O Notation Arrays vs. Linked List insertions](https://stackoverflow.com/q/7770569/215552), [Time Complexity of Doubly Linked List Element Removal?](https://stackoverflow.com/q/6161615/215552) – Heretic Monkey Jul 31 '20 at 15:17
  • @Carlos1232, I looked at that before, but it did not help much. I am interested in why the Big O's are what they are. Thanks though – stargiraffe Jul 31 '20 at 17:41
  • @HereticMonkey, it did help a little. So I am guessing that when I have to traverse the LL, that in itself is another operation: Search, which is O(N), while the process of inserting the node is O(1), right? So for example, if I want to delete a node, the act of deleting it is O(1) because I simply readjust the pointers, BUT I need to traverse the list (Search, O(N)) to get the previous node. So technically, deletion needs two processes: 1. Deletion and 2. Search. When we look at Big O, we look at these separately, I am assuming? – stargiraffe Jul 31 '20 at 17:46
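
To illustrate the deletion case from that last comment, here is a rough sketch (the node and list shape are assumptions, not my exact code): the search for the node is the O(N) part, and the pointer re-adjustment once the node is found is the O(1) part.

```javascript
// Rough sketch of deletion in a doubly linked list: find() is the O(N) search,
// and the unlinking inside remove() is the O(1) pointer re-adjustment.
class Node {
  constructor(key, value) {
    this.key = key;
    this.value = value;
    this.prev = null;
    this.next = null;
  }
}

class DoublyLinkedList {
  constructor() {
    this.head = null;
    this.tail = null;
  }

  // Append at the tail: O(1), since we keep a tail pointer.
  push(key, value) {
    const node = new Node(key, value);
    if (!this.head) {
      this.head = this.tail = node;
    } else {
      node.prev = this.tail;
      this.tail.next = node;
      this.tail = node;
    }
    return node;
  }

  // Search: walk the list node by node -> O(N).
  find(key) {
    let current = this.head;
    while (current) {
      if (current.key === key) return current;
      current = current.next;
    }
    return null;
  }

  // Deletion: the traversal (find) is O(N), but once we have the node,
  // re-pointing its neighbours is O(1).
  remove(key) {
    const node = this.find(key);
    if (!node) return false;
    if (node.prev) node.prev.next = node.next;
    else this.head = node.next;
    if (node.next) node.next.prev = node.prev;
    else this.tail = node.prev;
    return true;
  }
}

const list = new DoublyLinkedList();
list.push("a", 1);
list.push("b", 2);
list.remove("a"); // O(N) to find "a", then O(1) to unlink it
```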
