Continuing from my previous post, I'm trying to use bulk_save_objects for a list of objects. The objects don't have a PK value, so the database should generate one for each of them. When I call bulk_save_objects I see one INSERT per object instead of a single INSERT for all of them.
The code:
from hashlib import md5
from sqlalchemy import Column, BIGINT, TEXT
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()

class Product(Base):
    __tablename__ = 'products'
    id = Column('id', BIGINT, primary_key=True)
    barcode = Column('barcode', BIGINT)
    productName = Column('name', TEXT, nullable=False)
    objectHash = Column('objectHash', TEXT, unique=True, nullable=False)

    def __init__(self, productData, picture=None):
        # ProductTagsEnum is defined elsewhere in my code
        self.barcode = productData[ProductTagsEnum.barcode.value]
        self.productName = productData[ProductTagsEnum.productName.value]
        self.objectHash = md5((str(self.barcode) + self.productName).encode('utf-8')).hexdigest()
Another class contains the following save method:
from sqlalchemy.orm import sessionmaker

def saveNewProducts(self, products):
    Session = sessionmaker(bind=engine)  # bound to the engine from the UPDATE below
    session = Session()
    productsHashes = [product.objectHash for product in products]
    query = session.query(Product.objectHash).filter(Product.objectHash.in_(productsHashes))
    existedHashes = {objectHash for (objectHash,) in query.all()}  # rows come back as 1-tuples
    # keep only the products whose hash is not already in the DB
    newProducts = [product for product in products if product.objectHash not in existedHashes]
    # also tried: session.bulk_save_objects(newProducts, preserve_order=False)
    session.bulk_save_objects(newProducts)
    session.commit()
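For comparison, this is roughly the behaviour I expected (the rows below are made up for illustration). As far as I understand, bulk_insert_mappings() takes plain dicts keyed by the mapped attribute names, and with return_defaults left at its default of False there is no RETURNING clause, so the rows should go out as a single executemany:

# Made-up rows for illustration; keys are the mapped attribute names.
rows = [
    {'barcode': 5008251, 'productName': 'ice cream',
     'objectHash': 'b2752233ec523f2e874dc95b70020ae5'},
    {'barcode': 5008252, 'productName': 'frozen peas',
     'objectHash': 'e5f6...'},  # placeholder hash
]
session = Session()
session.bulk_insert_mappings(Product, rows)  # no return_defaults, so no RETURNING expected
session.commit()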
UPDATE 1
Following what @Ilja Everilä recommended in the comments, I added a few parameters to the engine configuration:
from sqlalchemy import create_engine

engine = create_engine('postgresql://postgres:123@localhost:5432/mydb',
                       pool_size=25, max_overflow=0,
                       executemany_mode='values',
                       executemany_values_page_size=10000,
                       executemany_batch_page_size=500,
                       echo=True)
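From what I can tell from the docs, executemany_mode='values' only applies to executemany-style executions, i.e. one statement run against a list of parameter sets. As a sanity check of the engine settings I tried a plain Core insert (again with made-up rows), which psycopg2's execute_values should fold into a single multi-row INSERT:

# Made-up rows; keys here are the table column names ('name', not 'productName').
with engine.begin() as conn:
    conn.execute(
        Product.__table__.insert(),
        [{'barcode': 1, 'name': 'a', 'objectHash': 'h1'},
         {'barcode': 2, 'name': 'b', 'objectHash': 'h2'}],
    )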
In the console I still saw a separate INSERT per object, each in the following format:
2019-09-16 16:48:46,509 INFO sqlalchemy.engine.base.Engine INSERT INTO products (barcode, productName, objectHash) VALUES (%(barcode)s, %(productName)s, %(objectHash)s) RETURNING products.id
2019-09-16 16:48:46,509 INFO sqlalchemy.engine.base.Engine {'barcode': '5008251', 'productName': 'ice ream', 'objectHash': 'b2752233ec523f2e874dc95b70020ae5'}
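My current suspicion, based on my reading of the docs: the RETURNING products.id clause is what prevents batching, because as far as I can tell the execute_values path is only used for INSERT statements without RETURNING, and the ORM adds RETURNING when it wants the newly generated primary keys back. If that's right, explicitly forcing return_defaults off would be the first thing to check (a sketch, using the session from above):

# If return_defaults were somehow enabled, each INSERT would need
# RETURNING to fetch the new id; being explicit should rule that out.
session.bulk_save_objects(newProducts, return_defaults=False)
session.commit()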