In Python pandas, does chunksize matter when reading in a large file?
e.g.
import pandas as pd

df = pd.DataFrame()
for chunk in pd.read_csv('example.csv', chunksize=1000):
    df = pd.concat([df, chunk], ignore_index=True)
Does setting chunksize to a larger or smaller number make the file load faster overall?
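For reference, here is a variant of the loop I have also seen suggested, which collects the chunks in a list and concatenates once at the end instead of calling pd.concat on every iteration (the in-memory CSV below is just a stand-in for example.csv so the snippet is self-contained):

```python
import io
import pandas as pd

# Stand-in for example.csv: 5000 rows of two integer columns.
csv_data = io.StringIO("\n".join(["a,b"] + [f"{i},{i * 2}" for i in range(5000)]))

# Accumulate chunks in a list, then concatenate once. This avoids re-copying
# the growing DataFrame on every iteration, which the
# df = pd.concat([df, chunk]) pattern inside the loop does.
chunks = []
for chunk in pd.read_csv(csv_data, chunksize=1000):
    chunks.append(chunk)
df = pd.concat(chunks, ignore_index=True)

print(len(df))  # total row count is the same regardless of chunksize
```

I am mainly asking about the chunksize value itself, but I would also like to know whether this single-concat version matters for speed.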