
I'm writing to a local SQLite database in chunks. Partway through the run I get the error "attempt to write a readonly database". If I change the chunk size, the point at which the error occurs changes as well. The database file grows as the script progresses; sometimes it reaches 15 MB before the error, sometimes 35 MB. Any ideas on how to fix this or why it's occurring?

import pandas as pd
from sqlalchemy import create_engine
import datetime as dt

disk_engine = create_engine('sqlite:///C:\\databases\\test.db')

start = dt.datetime.now()
chunksize = 100000
j = 0
index_start = 1

for df in pd.read_csv(r'C:\my_file.txt', sep='\t', error_bad_lines=False, chunksize=chunksize, iterator=True, encoding='ISO-8859-1'):

  df = df.rename(columns={c: c.replace(' ', '') for c in df.columns}) # Remove spaces from columns

  df.index += index_start

  columns = ['column_1']

  for c in df.columns:
     if c not in columns:
        df = df.drop(c, axis=1)    

  j+=1
  print '{} seconds: completed {} rows'.format((dt.datetime.now() - start).seconds, j*chunksize)

  df.to_sql('data', disk_engine, if_exists='append')
  index_start = df.index[-1] + 1

Output:

0 seconds: completed 100000 rows
1 seconds: completed 200000 rows
3 seconds: completed 300000 rows
4 seconds: completed 400000 rows
6 seconds: completed 500000 rows
7 seconds: completed 600000 rows
8 seconds: completed 700000 rows
10 seconds: completed 800000 rows
11 seconds: completed 900000 rows
13 seconds: completed 1000000 rows
14 seconds: completed 1100000 rows
16 seconds: completed 1200000 rows
17 seconds: completed 1300000 rows
19 seconds: completed 1400000 rows
OperationalError: (sqlite3.OperationalError) attempt to write a readonly database [SQL: u'INSERT INTO 
data ("index") VALUES (?)'] [parameters: ((1300001L,), (1300002L,), (1300003L,), (1300004L,),
(1300005L,), (1300006L,), (1300007L,), (1300008L,)  ... displaying 10 of 100000 total bound parameter
sets ...  (1399999L,), (1400000L,))]
nonegiven72
  • Can you replicate that on a different PC? Maybe your hard drive is failing? – syntonym Jul 19 '16 at 12:36
  • Is some other process also using the C:\\databases\\test.db file while your program is running? – mowcow Jul 19 '16 at 12:38
  • no, nothing else should be using this file. I've tried creating new databases "test1.db", "test2.db" etc... and I've had this error as well. – nonegiven72 Jul 19 '16 at 12:58
  • I had the same error; in my case it was the OneDrive sync process that was blocking the file. – tmsss Nov 19 '18 at 11:19
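If a sync client or antivirus scanner is intermittently locking the file (as the comments suggest), one workaround is to retry the failed chunk instead of crashing. This is a minimal sketch, not a fix for the underlying lock: the `to_sql_with_retry` helper, the retry/delay parameters, and the in-memory engine are illustrative assumptions, and the more robust fix is to move the database out of any synced folder.

```python
import time

import pandas as pd
from sqlalchemy import create_engine
from sqlalchemy.exc import OperationalError

def to_sql_with_retry(df, table, engine, retries=5, delay=2.0):
    """Append df to table, retrying when SQLite briefly reports the
    file as read-only/locked (hypothetical helper for illustration)."""
    for attempt in range(retries):
        try:
            df.to_sql(table, engine, if_exists='append')
            return
        except OperationalError:
            if attempt == retries - 1:
                raise  # still failing after all retries: re-raise
            time.sleep(delay)  # give the other process time to release the file

# In-memory database for illustration; a real run would use a file path
# outside any OneDrive/Dropbox-synced directory.
engine = create_engine('sqlite://')
chunk = pd.DataFrame({'column_1': [1, 2, 3]})
to_sql_with_retry(chunk, 'data', engine)
```

Each chunk in the loop would then go through `to_sql_with_retry` instead of calling `df.to_sql(...)` directly.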
