I have a very large tab-delimited file: 4.4 million rows, 51 columns of data, approx. 1.5 GB.
When I try to load it with the SQLite command .import, it always reports that the last row has only 24 columns, and no data ends up in the table (I assume the transaction rolls back).
If I import just the last 10 rows of the file, they are inserted without any error, so the column count in those rows can't actually be wrong.
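For reference, the import was done roughly like this; here is a minimal sketch with a tiny stand-in file (the file, database, and table names below are placeholders, not my real ones):

```shell
# Create a small stand-in tab-delimited file (hypothetical names throughout).
printf 'col1\tcol2\tcol3\n1\t2\t3\n' > /tmp/sample.tsv

# Import it with the sqlite3 CLI in tab-separated mode. Since the table "t"
# does not exist yet, .import uses the first row as the column names.
rm -f /tmp/sample.db
sqlite3 /tmp/sample.db <<'EOF'
.mode tabs
.import /tmp/sample.tsv t
SELECT count(*) FROM t;
EOF
```

To test the last rows separately I extracted them first, e.g. `tail -n 10 bigfile.tsv > last10.tsv`, and imported that file the same way.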
SQLite performance for large files doesn't seem to be a problem.
Is there a limit to the size of the import?