
I have a huge dump file I created from an sqlite db, and I am trying to load it into a postgres db.

Some of the rows are invalid, and I want to skip them. For example, there are objects with a char field limited to 1000 chars, and some rows in fact have longer values. Because of them, django won't let me load the fixture data ("value too long for type character varying(1000)").

How do I tell django to just skip those invalid items and load all the rest?

thanks!

EDIT: preferred option: if I can truncate those values instead of skipping the object entirely, that would be better.

– Ronen Ness

1 Answer


If you don't care about strings that are longer than 1000 chars, the simplest way is to 'dump' django's dumpdata and export directly from sqlite. You can then use sqlite's substr function to truncate the unwanted data.

To export to a CSV file from sqlite, type .mode csv in the sqlite shell, followed by .output filename, followed by the query. Instead of calling SELECT * you need to call SELECT whatever, substr(whateverelse, 1, 1000). For additional help on dumping, see this question and its answers.
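For example, a minimal sketch of the shell session (the table and column names here are placeholders for your own schema; note that substr in sqlite is 1-indexed, so the slice starts at position 1):

    .mode csv
    .output dump.csv
    SELECT id, name, substr(description, 1, 1000) FROM myapp_mytable;
    .output stdout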

Having dumped from sqlite, you can import into postgres using COPY FROM.
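Something along these lines (again, the table, column, and file names are placeholders; also, COPY FROM reads a file on the database server, so if your CSV sits on the client machine use psql's \copy instead):

    COPY myapp_mytable (id, name, description)
    FROM '/path/to/dump.csv'
    WITH (FORMAT csv);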

– e4c5
  • hi e4c5, interesting approach. Because I was in a hurry, I ended up doing it the ugly way: removed the length limit, let everything load, ran a script that cleaned the rows that were too long, then added the length limit back on. Thanks anyway :) – Ronen Ness Aug 27 '15 at 07:02