I am working with a SQLite database using SQLAlchemy. It is quite a large database, and loading it repeatedly (on each iteration of a loop) is slowing down my operations. I only actually need the last 10 rows of data for my calculations. How can I load only those last few rows and convert them to a pandas DataFrame using pd.read_sql?
I have tried using the stream_results functionality with chunksize as follows:
db_name = "SELECT * FROM " + print_name + "_df"  # query string for the per-name table
conn = db_engine.connect().execution_options(stream_results=True)
for df_10 in pd.read_sql(db_name, conn, chunksize=10):
    df = df_10  # after the loop, df holds the final chunk
This works, in that df_10 ends up as the last 10 rows; however, the operation still has to iterate over every chunk, which takes longer than loading the whole database in one go. Is there any way to just pull the latest 10 rows without having to either load the whole lot or split it and load it in n chunks of 10?
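For reference, the kind of thing I am hoping is possible is something along these lines, i.e. asking the database for only the newest rows instead of streaming everything. This is only a sketch of the idea: it assumes the table has a column that reflects insertion order (an autoincrementing id, a timestamp, or SQLite's implicit rowid), and the column name "id" below is just a placeholder for whatever that column actually is.

import pandas as pd

# grab only the newest 10 rows by ordering descending and limiting,
# then flip them back into ascending order for the calculations
query = "SELECT * FROM " + print_name + "_df ORDER BY id DESC LIMIT 10"
with db_engine.connect() as conn:
    df = pd.read_sql(query, conn).iloc[::-1].reset_index(drop=True)

I am not sure whether this is the idiomatic way to do it with SQLAlchemy and pd.read_sql, though.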