How can I commit entries to an SQL database with SQLAlchemy while tolerating errors? Committing a large batch of entries in a single commit is much more efficient, but if one entry is invalid, e.g. text in an integer column, the whole commit fails and none of the batch is saved. My workaround below commits entries individually, but it can open too many connections to the MySQL server, particularly when run in parallel. Is there a more efficient way to commit entries as a batch while still allowing room for error?
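For context, the all-or-nothing batch version looks roughly like this (a minimal sketch, not my actual code; the name commit_batch is made up, and it assumes entries is a list of already-mapped ORM objects and enginetext is a valid connection string):

from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

def commit_batch(entries, enginetext):
    # Sketch of the batch approach: add everything, commit once.
    engine = create_engine(enginetext)
    session = sessionmaker(bind=engine)()
    try:
        session.add_all(entries)
        session.commit()    # one bad row aborts the entire batch
    except Exception as e:
        session.rollback()  # nothing from this batch is saved
        print("Batch failed:", e)
    finally:
        session.close()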
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

def commitentry(database, enginetext, verbose=False):
    """
    Takes a list of mapped objects (database) and a connection string
    (enginetext), and adds every entry in the list to the SQL database,
    committing them one at a time so a bad entry only loses itself.
    """
    engine = create_engine(enginetext)
    Session = sessionmaker()
    Session.configure(bind=engine)
    session = Session()
    counter = 0
    for entry in database:
        try:
            session.add(entry)
            session.commit()
        except Exception as e:
            # A failed commit leaves the session dirty, so roll back
            # before moving on to the next entry.
            print("Commit Error")
            session.rollback()
            if verbose:
                print(e)
        finally:
            counter += 1
            if verbose:
                # Progress: entries processed and fraction of the list done.
                print(counter, counter / float(len(database)))
    if verbose:
        print("Entries saved!")
    session.close()