
Can someone comment on how the complexity of dictionaries increases as we "nest" them further?

E.g., as I add an element as follows:

dict_name[a][b][c][d] = 0

I would think the lookup time should be the same for any dictionary (constant time O(1)), but would it change significantly if I add elements like this?

dict_name[a][b][c]....[z]
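For concreteness, a fixed-depth nesting like the first example can be sketched as follows (the key names are made up for illustration):

```python
# Hypothetical sketch: build a depth-4 nested dict, then read it back.
keys = ["a", "b", "c", "d"]

# Build the nesting from the inside out.
nested = 0
for key in reversed(keys):
    nested = {key: nested}

# Reading it back costs one hash lookup per level.
assert nested["a"]["b"]["c"]["d"] == 0
```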
albert
Shiv
  • Are you aware that a nesting of depth 26 will hold at least 2^26 = 67108864 distinct dictionaries (assuming every dimension allows 2 distinct values; and 3^26 = 2541865828329 dictionaries for 3 values per dimension, which is far beyond what a PC can handle)? – Yves Daoust Aug 31 '15 at 08:06
  • You don't seem to have understood the concept of `O(1)`. If you do an `O(1)` operation a fixed number of times, you still get `O(1)`. – skyking Aug 31 '15 at 08:35
  • [This](http://stackoverflow.com/a/22188943/3936732) SO answer explains why the constant factor doesn't matter (except that it does). – David Mašek Aug 31 '15 at 08:46
  • @YvesDaoust Thanks for clarifying that. I was curious what impact a nested dictionary would have on both time and memory :) – Shiv Sep 01 '15 at 14:58

2 Answers


Python's dictionary implementation doesn't change with nesting, no, so the algorithmic complexity of a lookup does not change. As far as Python is concerned, each `[key]` subscription is independent of where the object you are subscripting came from.

Each lookup is still O(1). Looking up a nested element is then depth × O(1) lookups. Since you hard-coded the depth (by using literal notation), what you have is still O(1), as a constant multiplier doesn't affect big-O complexity.
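A small sketch of that independence (toy data, assumed for illustration): each subscription operates on whatever object the previous one returned, and Python neither knows nor cares that it came out of another dict.

```python
# Toy nested dict; each [key] below is an ordinary, independent dict lookup.
d = {"a": {"b": {"c": {"d": 0}}}}

inner = d["a"]               # same operation as any flat dict lookup
assert inner == {"b": {"c": {"d": 0}}}
assert inner["b"]["c"]["d"] == 0

# Equivalent to the chained form:
assert d["a"]["b"]["c"]["d"] == 0
```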

Martijn Pieters
dict_name[a][b][c][d] = 0

This is essentially the same as the following:

temp_a = dict_name[a]
temp_b = temp_a[b]
temp_c = temp_b[c]
temp_c[d] = 0

So you have three lookups, each retrieving an object from a dictionary that just happens to be another dictionary. Then, in the final step, you make one dictionary assignment.

As we know, dictionary lookups all take constant time on average, so each of those four operations is O(1) by itself. Combined, you get 4 × O(1), which is still O(1) (because constant factors are not significant for big O).

If you now increase the depth of that nesting, all you get is more `temp_y = temp_x[k]` lines, each again constant time. So you only increase the factor k in k × O(1). And as said before, as long as k is constant, that factor is not significant for big O, so you stay at constant time regardless of your nesting depth.
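The unrolled `temp_*` chain generalizes to a loop, one O(1) lookup per level; here is a sketch with made-up keys:

```python
# k-level descent: one dict lookup per level, so k × O(1) overall.
nested = {"a": {"b": {"c": {"d": 0}}}}
keys = ["a", "b", "c", "d"]

obj = nested
for key in keys[:-1]:        # like temp_a, temp_b, temp_c above
    obj = obj[key]           # one O(1) lookup per level
obj[keys[-1]] = 1            # final assignment, also O(1)

assert nested["a"]["b"]["c"]["d"] == 1
```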

poke