I have to insert a lot of records into a table. I have wrapped the insert logic in a method of a class, but inserting the records one at a time takes a long time. To speed this up, I keep the records in a buffer (a list stored as a data member of the class), and when the number of buffered records reaches 100 I do a bulk insert.
Here is the code.
import types
from sqlalchemy import Table

class TabDailyQuotes:
    def __init__(self, metadata):
        self.metadata = metadata
        self.indexes = Table('DailyQuotes', metadata, autoload=True)
        self.ins_buf = []                 # buffered records waiting to be written
        self.ins = self.indexes.insert()

    def insert(self, dict_data):
        if isinstance(dict_data, types.DictType):
            # a dict of records: buffer each value
            for key, val in dict_data.items():
                self.ins_buf.append(val)
        else:
            self.ins_buf.append(dict_data)
        if len(self.ins_buf) >= 100:      # flush the buffer as one bulk insert
            self.ins.execute(self.ins_buf)
            self.ins_buf = []

    def __del__(self):
        # flush whatever is still buffered at the end of the object's life
        if len(self.ins_buf) > 0:
            self.ins.execute(self.ins_buf)
            self.ins_buf = []
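For context, the class gets used roughly like this (a sketch: the engine URL, fetch_daily_quotes(), and the exact shape of each record are placeholders standing in for my real code):

from sqlalchemy import create_engine, MetaData

engine = create_engine('sqlite:///quotes.db')   # placeholder connection URL
metadata = MetaData(bind=engine)                # bound so ins.execute() works

quotes = TabDailyQuotes(metadata)
for batch in fetch_daily_quotes():   # hypothetical source yielding dicts of row dicts
    quotes.insert(batch)             # buffers the rows, bulk-inserts every 100
# at this point fewer than 100 rows may still be sitting in quotes.ins_buf,
# and only __del__ (whenever it runs) will write them out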
The problem is that records arrive one at a time through repeated calls to insert(), so after the last call there are usually leftover records in the buffer, and I have to rely on the __del__() magic method to insert them at the end of the object's life.
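The only alternative I have thought of so far is to add an explicit flush() method and call it from a context manager, something like the sketch below (untested; the subclass name and flush() are only illustrative, they are not part of my real code), but I am not sure that is the right approach.

class FlushingTabDailyQuotes(TabDailyQuotes):
    """Sketch: same buffering, but leftovers are flushed explicitly."""

    def flush(self):
        # write out whatever is still sitting in the buffer
        if self.ins_buf:
            self.ins.execute(self.ins_buf)
            self.ins_buf = []

    def __enter__(self):
        return self

    def __exit__(self, exc_type, exc_value, traceback):
        self.flush()        # flush leftovers when the with-block ends
        return False        # do not swallow exceptions

# usage: leftover rows are written out when the block exits,
# instead of waiting for __del__
with FlushingTabDailyQuotes(metadata) as quotes:
    for batch in fetch_daily_quotes():   # same hypothetical source as above
        quotes.insert(batch)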
Can anybody tell me a better way to deal with this?