I have a class which matches a schema.

class EnvelopeData(object):

  def __init__(self, table_name, payload):
    self.envelope_data = OrderedDict()
    self.table = table_name
    self.payload = payload

  def get_envelope_data(self, use_hex_encoding):
    """
      Build the envelope data to be sent with the actual record payload.

      :param use_hex_encoding: whether to hex-encode the serialized payload
      :raise:
    """
    self.envelope_data["table"] = self.table.encode(STRING_ENCODING)
    self.envelope_data["payload"] = Utility.get_serialized_avro(self.table,
                                                                hex_encoding=use_hex_encoding)

    print "For table ", self.table, sys.getsizeof(self.envelope_data["table"]) + sys.getsizeof(self.envelope_data["payload"])

and also, when I print:

a = EnvelopeData(table_name, payload)
a.get_envelope_data(use_hex_encoding=False)
print sys.getsizeof(a.envelope_data)

The size of the elements inside is greater than the size of the OrderedDict that contains them. So I'm confused as to how a dict can take less space than its constituent elements.
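The observation can be reproduced without any of the Avro machinery. Below is a minimal, self-contained sketch (written in Python 3, unlike the Python 2 snippets above) that puts one large value into an OrderedDict and compares sizes:

```python
import sys
from collections import OrderedDict

# A large bytes payload stored inside an OrderedDict.
d = OrderedDict()
d["payload"] = b"x" * 100000

# The value alone is far larger than the dict that "contains" it,
# because the dict only holds a reference to the bytes object.
print(sys.getsizeof(d["payload"]))  # roughly 100 KB
print(sys.getsizeof(d))             # only the dict's own overhead

assert sys.getsizeof(d) < sys.getsizeof(d["payload"])
```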


1 Answer

The sys.getsizeof() documentation specifically says:

Only the memory consumption directly attributed to the object is accounted for, not the memory consumption of objects it refers to.

Meaning, the items inside the dictionary are not counted as part of its size; only the dictionary object itself is measured. A dict stores references to its keys and values, not the objects themselves, so its reported size can easily be smaller than that of a single large value it holds.
The documentation also links to a recursive sizeof recipe that does include the items contained in the dictionary.
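A simplified sketch of such a recursive sizeof (not the exact recipe from the docs; it only descends into dicts and common sequence types, and uses a `seen` set to avoid double-counting shared objects):

```python
import sys

def deep_getsizeof(obj, seen=None):
    """Recursively sum getsizeof over containers (simplified sketch)."""
    if seen is None:
        seen = set()
    if id(obj) in seen:      # already counted (shared or cyclic reference)
        return 0
    seen.add(id(obj))
    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(deep_getsizeof(k, seen) + deep_getsizeof(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(deep_getsizeof(item, seen) for item in obj)
    return size

d = {"payload": b"x" * 100000}
# deep_getsizeof includes the 100 KB value; plain getsizeof does not.
print(sys.getsizeof(d))
print(deep_getsizeof(d))
```

With a helper like this, the envelope dict's reported size will exceed the sizes of its individual entries, matching the intuition in the question.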
