I need to output the results of a SQL query to CSV. The full result set is larger than my available memory. Also, CSV operations have typically been much slower for me in pandas than with the csv library, so I'd prefer not to use pandas.
I wrote the code below intending to batch rows into lists of 1000 and append each batch to a CSV file. When I run it, my system just runs out of memory, and it doesn't work as I anticipated.
I'm not sure what I'm doing wrong, or if there's something I just don't understand.
import csv

ROWS_AT_ONCE = 1000

curr.execute(
    '''
    SELECT
        *
    FROM ''' + Table_Name + '''
    ;
    '''
)
rows = curr.fetchall()  # pulls the entire result set into memory at once
headers = tuple(col[0] for col in curr.description)
csv_w_pointer = open(Export_Path_Name, 'a', newline='')
csv_writer = csv.writer(csv_w_pointer, delimiter='\t', quotechar='"')
csv_writer.writerow(headers)
batch = list()
for row in rows:
    batch.append(row)
    if len(batch) >= ROWS_AT_ONCE:
        csv_writer.writerow(batch)  # writerow() emits the whole batch as a single csv row
        batch = list()
if batch:
    csv_writer.writerow(batch)
del batch
I need to write the SQL results to CSV without ever holding the full result set in memory; my current attempt fails because it exceeds available memory.
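Is something like the following closer to what I should be doing? This is only a minimal sketch of the batch-and-append pattern I'm after, assuming a standard DB-API cursor (so fetchmany is available) and reusing curr, Table_Name, and Export_Path_Name from the snippet above:

import csv

ROWS_AT_ONCE = 1000  # rows fetched per batch

curr.execute('SELECT * FROM ' + Table_Name)

with open(Export_Path_Name, 'w', newline='') as f:
    csv_writer = csv.writer(f, delimiter='\t', quotechar='"')
    csv_writer.writerow([col[0] for col in curr.description])  # header row
    while True:
        batch = curr.fetchmany(ROWS_AT_ONCE)  # at most ROWS_AT_ONCE rows held at a time
        if not batch:  # empty sequence means the result set is exhausted
            break
        csv_writer.writerows(batch)  # one csv line per database row

My understanding is that this way only one batch of ROWS_AT_ONCE rows is ever held in Python at a time, and writerows writes each database row as its own CSV line, but I'd like confirmation that this is the right approach.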