I have a very large pandas DataFrame that I am attempting to insert into MongoDB. The trouble is memory management. My code is below: I am using 'insert_many' to load the entire frame into the DB in a single call, and the process uses a lot of memory. Is there a way to accomplish the same goal with less memory usage?
import pymongo
from time import time

start = time()
client = pymongo.MongoClient()
db = client.test_db
collection = db.collection
# 'data' is the large DataFrame; to_dict('records') materializes
# every row as a dict in memory before the single bulk insert
collection.insert_many(data.to_dict('records'))
end = time()
print("Time to Populate DB:", end - start)
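One idea I have sketched is inserting in chunks, so that only one slice of the frame is converted to dicts at a time instead of the whole thing. The CHUNK_SIZE value below is an arbitrary guess on my part, and I'm not sure this actually lowers peak memory or whether it's the idiomatic approach:

import pymongo
from time import time

client = pymongo.MongoClient()
collection = client.test_db.collection

CHUNK_SIZE = 1000  # arbitrary guess; no idea what a sensible value is

start = time()
for i in range(0, len(data), CHUNK_SIZE):
    # only this slice of rows is converted to dicts at a time,
    # rather than the entire DataFrame at once
    chunk = data.iloc[i:i + CHUNK_SIZE]
    collection.insert_many(chunk.to_dict('records'))
end = time()
print("Time to Populate DB:", end - start)

Is something like this the right direction, or is there a better way?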