
I'm trying to load a dump of a big PostgreSQL database (>1 GB) into SQLite using Python's sqlite3 module. This is my code:

# import package
import sqlite3

# load data
importpath = r'C:\directory\dump.sql'            # raw string so the backslashes survive
con = sqlite3.connect(r'C:\directory\dump.db')   # separate target database file, not the dump itself
cur = con.cursor()
f = open(importpath, 'r')
sql = f.read()
cur.executescript(sql)

I keep getting a very unspecific error. It just says "MemoryError:" with an arrow pointing at "sql = f.read()". I guess there isn't enough memory to hold all the data at once, but I really can't tell. Any advice on fixing that? I just need to get the data into a local database that I can access from Python. Unfortunately, using a separate local database application is not an option, since I'm stuck with what is already installed.
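
For what it's worth, the MemoryError almost certainly comes from f.read() pulling the whole >1 GB file into memory at once. Below is a minimal streaming sketch that executes the dump one statement at a time instead. It naively assumes each statement ends at a line ending in ';', which Postgres-specific content such as COPY ... FROM stdin blocks will break, so treat it as a starting point only; the paths are illustrative:

import sqlite3

# Illustrative paths; adjust to the real dump and database locations
importpath = r'C:\directory\dump.sql'
con = sqlite3.connect(r'C:\directory\dump.db')
cur = con.cursor()

buf = []
with open(importpath, 'r') as f:
    for line in f:                       # stream line by line instead of f.read()
        buf.append(line)
        if line.rstrip().endswith(';'):  # naive end-of-statement check
            stmt = ''.join(buf)
            buf = []
            try:
                cur.execute(stmt)        # run one statement at a time
            except sqlite3.Error:
                pass                     # Postgres-only statements won't run in SQLite

con.commit()
con.close()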

mango
    SQLite is simply not the right tool for large databases. If it's from Postgres, why are you not *using* Postgres? – tripleee Jan 18 '19 at 08:58
  • Try to import the dump with the SQLite [command line interface](https://www.sqlite.org/cli.html). See https://stackoverflow.com/a/2049137/4265407 (a Python sketch of this approach is below). – Stanislav Ivanov Jan 18 '19 at 09:17
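
Following up on the CLI suggestion above: if the sqlite3 command-line shell happens to be available (the question notes that extra applications may not be), the dump can be fed to it on stdin from Python, so the file is streamed rather than read into memory in one go. A minimal sketch with illustrative paths; the dump still has to be SQL that SQLite understands:

import subprocess

importpath = r'C:\directory\dump.sql'   # illustrative paths
dbpath = r'C:\directory\dump.db'

# The sqlite3 shell reads from stdin and executes as it goes,
# so Python never holds the whole dump in memory.
with open(importpath, 'r') as f:
    subprocess.run(['sqlite3', dbpath], stdin=f, check=True)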

0 Answers