I'm writing a large tab-delimited file to a local SQLite database in chunks. Partway through the run I get the error "attempt to write a readonly database". If I change the chunk size, the point at which the error occurs changes as well. The database file is created and grows as the script runs; sometimes it reaches 15 MB before the error, sometimes 35 MB. Any ideas on how to fix this, or why it's occurring?
import pandas as pd
from sqlalchemy import create_engine
import datetime as dt

disk_engine = create_engine('sqlite:///C:\\databases\\test.db')

start = dt.datetime.now()
chunksize = 100000
j = 0
index_start = 1

for df in pd.read_csv('C:\\my_file.txt', sep='\t', error_bad_lines=False,
                      chunksize=chunksize, iterator=True, encoding='ISO-8859-1'):
    df = df.rename(columns={c: c.replace(' ', '') for c in df.columns})  # remove spaces from column names
    df.index += index_start

    # keep only the columns I want
    columns = ['column_1']
    for c in df.columns:
        if c not in columns:
            df = df.drop(c, axis=1)

    j += 1
    print '{} seconds: completed {} rows'.format((dt.datetime.now() - start).seconds, j * chunksize)

    df.to_sql('data', disk_engine, if_exists='append')
    index_start = df.index[-1] + 1
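(Side note: I realize the drop loop could be replaced with a plain column selection, something like the line below, but I don't think the filtering is what triggers the error.)

# equivalent to the drop loop: keep only the columns listed in `columns`
df = df[columns]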
Output:
0 seconds: completed 100000 rows
1 seconds: completed 200000 rows
3 seconds: completed 300000 rows
4 seconds: completed 400000 rows
6 seconds: completed 500000 rows
7 seconds: completed 600000 rows
8 seconds: completed 700000 rows
10 seconds: completed 800000 rows
11 seconds: completed 900000 rows
13 seconds: completed 1000000 rows
14 seconds: completed 1100000 rows
16 seconds: completed 1200000 rows
17 seconds: completed 1300000 rows
19 seconds: completed 1400000 rows
OperationalError: (sqlite3.OperationalError) attempt to write a readonly database [SQL: u'INSERT INTO
data ("index") VALUES (?)'] [parameters: ((1300001L,), (1300002L,), (1300003L,), (1300004L,),
(1300005L,), (1300006L,), (1300007L,), (1300008L,) ... displaying 10 of 100000 total bound parameter
sets ... (1399999L,), (1400000L,))]
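In case it helps narrow things down, this is the kind of check I can add right before each to_sql call, to see whether Windows still considers the file writable at the moment the error hits (just a sketch, using the same test.db path as above):

import os
import stat

db_path = 'C:\\databases\\test.db'

# does the OS think the file and its directory are writable?
print 'file writable (os.access):', os.access(db_path, os.W_OK)
print 'directory writable (os.access):', os.access(os.path.dirname(db_path), os.W_OK)

# is the read-only attribute set on the file itself?
print 'read-only attribute set:', not (os.stat(db_path).st_mode & stat.S_IWRITE)

If either of those flips partway through the run, that would at least tell me something outside the script is touching the file.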