I have a function set up for Pandas that runs through a large number of rows in input.csv and puts the results into a Series. It then writes the Series to output.csv.
However, if the process is interrupted (for example by an unexpected event), the program terminates and all of the data that would have gone into the csv is lost.
Is there a way to write the data continuously to the csv, regardless of whether the function finishes for all rows?
Preferably, each time the program starts, a blank output.csv would be created and then appended to while the function is running.
import pandas as pd
df = pd.read_csv("read.csv")
def crawl(a):
    # Create x, y (logic omitted)
    return pd.Series([x, y])
df[["Column X", "Column Y"]] = df["Column A"].apply(crawl)
df.to_csv("write.csv", index=False)