
I have a situation where I need to insert a lot of records into a table. I have wrapped the insert logic in a member function of a class, but it takes a lot of time. To optimize this, I keep the records in a buffer (a list) held as a data member of the class; when the number of buffered records reaches 100, I perform a bulk insert.

Here is the code.

import types

from sqlalchemy import Table


class TabDailyQuotes:

    def __init__(self, metadata):
        self.metadata = metadata
        self.indexes = Table('DailyQuotes', metadata, autoload=True)
        self.ins_buf = []
        self.ins = self.indexes.insert()

    def insert(self, dict_data):
        if isinstance(dict_data, types.DictType):
            for key, val in dict_data.items():
                self.ins_buf.append(val)
        else:
            self.ins_buf.append(dict_data)
        # >= rather than ==: the dict branch can append several records at
        # once and skip straight past 100
        if len(self.ins_buf) >= 100:
            self.ins.execute(self.ins_buf)
            self.ins_buf = []

    def __del__(self):
        if len(self.ins_buf) > 0:
            self.ins.execute(self.ins_buf)
            self.ins_buf = []

The problem is that when records arrive one at a time through repeated calls to insert(), I have to rely on the __del__ magic method to insert any leftover records at the end of the object's lifetime.

Can anybody tell me a better way to deal with this ?


1 Answer


Python does not guarantee that __del__ will ever be called. So it's best to make the clean-up explicit: define a distinct method for it (e.g. flush() or close()) and call it when the program shuts down or when you are finished with the TabDailyQuotes object.
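The explicit clean-up can be sketched roughly like this. The class and its `execute_batch` parameter are hypothetical, not from your code: `execute_batch` stands in for whatever actually writes the rows, e.g. `lambda rows: ins.execute(rows)` with your SQLAlchemy insert construct. Supporting the with-statement means leftovers are flushed deterministically when the block exits, with no reliance on garbage collection:

```python
# Sketch: buffer rows, flush in batches, and flush leftovers explicitly.
# `execute_batch` is a hypothetical callable supplied by the caller, e.g.
# lambda rows: ins.execute(rows) with a SQLAlchemy insert construct.
class BufferedInserter:
    def __init__(self, execute_batch, batch_size=100):
        self.execute_batch = execute_batch
        self.batch_size = batch_size
        self.buf = []

    def insert(self, row):
        self.buf.append(row)
        if len(self.buf) >= self.batch_size:
            self.flush()

    def flush(self):
        # Explicit clean-up; safe to call any number of times.
        if self.buf:
            self.execute_batch(self.buf)
            self.buf = []

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc, tb):
        self.flush()  # leftovers are written when the with-block exits
        return False
```

Usage then looks like `with BufferedInserter(...) as ins: ins.insert(record)`, and the final partial batch is flushed automatically, which is exactly the job __del__ was being asked to do.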

References:

I don't understand this python __del__ behaviour
Is it really OK to do object closeing/disposing in __del__?
