I have an application that a user can upload .CSV files to; the API reads each .CSV file and, for each row, determines whether it needs to be inserted into the database.
I loop over the rows in the CSV and append the items to a slice; once the slice size hits 500, I call GORM's CreateInBatches and then wipe the slice.
My issue is that memory usage keeps climbing with each file the user uploads, and I have narrowed it down to the slice and how it is being reset.
I initialise a slice, which I intend to wipe every 500 rows:
insertMain := make([]*models.MainEntry, 0)
I then loop over the .CSV file:
for {
	row, err := parser.Read()
	if err == io.EOF {
		break
	}
	if err != nil {
		log.Fatal(err)
	}

	if len(insertMain) >= 500 {
		u.db.CreateInBatches(insertMain, 500)
		insertMain = nil
	}

	// Processing happens here to determine if the row needs to be inserted.
	// If the row needs to be inserted, this is how I add it to the slice:
	entity := &models.MainEntry{
		SettlementDate:   &date,
		SettlementPeriod: &period,
		Zone:             &zone,
	}
	insertMain = append(insertMain, entity)
}
The reason I know the slice is causing the issue is that if I remove the append call so nothing gets added, the memory usage does not grow at all.
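For reference, heap growth between files can be observed with runtime.ReadMemStats; a minimal sketch (the printHeap helper name is just for illustration, not from the real code):

package main

import (
	"fmt"
	"runtime"
)

// printHeap logs the current heap allocation so growth can be compared
// before and after processing each uploaded file.
func printHeap(label string) {
	var m runtime.MemStats
	runtime.ReadMemStats(&m)
	fmt.Printf("%s: heap = %d KiB\n", label, m.HeapAlloc/1024)
}

func main() {
	printHeap("before file")
	// ... process an uploaded .CSV file here ...
	printHeap("after file")
}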
How can I wipe the slice and release its elements from memory once 500 items have been hit?
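What I understand so far (correct me if this is wrong): setting the slice to nil drops the reference to the backing array, while re-slicing with [:0] keeps the backing array and only resets the length. A self-contained sketch of the two reset forms I'm weighing, with a hypothetical Row type and flush function standing in for models.MainEntry and u.db.CreateInBatches:

package main

import "fmt"

// Row stands in for models.MainEntry in this sketch.
type Row struct{ Value int }

// flush stands in for u.db.CreateInBatches(batch, 500).
func flush(batch []*Row) {
	fmt.Printf("flushing %d rows\n", len(batch))
}

func main() {
	batch := make([]*Row, 0, 500)
	for i := 0; i < 2000; i++ {
		batch = append(batch, &Row{Value: i})
		if len(batch) >= 500 {
			flush(batch)

			// Option 1: drop the backing array; the next append allocates a
			// fresh one and the old batch becomes eligible for garbage collection.
			batch = nil

			// Option 2: keep and reuse the backing array (no reallocation),
			// but the old *Row pointers stay reachable until overwritten:
			// batch = batch[:0]
		}
	}
	if len(batch) > 0 {
		flush(batch) // flush any leftover rows after the loop ends
	}
}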