
I have a large (200 MB) JSON file (actually a text file) that is too large to decode in PHP using json_decode. There is apparently no native way in PHP to decode a file this large; you need a third-party library, which my host frowns upon.

Accordingly, I had the idea of converting the JSON file into a MySQL database table.

Here is the format of the JSON. Can anyone suggest code to import this into a MySQL database?

Here is my code:

//This works fine
$longstr = file_get_contents("json-largefile.txt"); 
//This works with a small sample file but fails with the large 200 MB file
$array = json_decode($longstr, true); 
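Since the sample below has one JSON object per line, one workaround (a sketch, not a tested solution, and it does iterate line by line) is to decode each line individually so the whole 200 MB array never has to sit in memory at once. The bracket/comma stripping assumes the file keeps exactly this one-object-per-line layout:

    <?php
    // Sketch: stream the file one line (one JSON object) at a time.
    $handle = fopen("json-largefile.txt", "r");
    if ($handle === false) {
        die("Cannot open file");
    }
    while (($line = fgets($handle)) !== false) {
        // Strip the surrounding array brackets and trailing comma
        // so each line becomes a bare JSON object.
        $line = trim($line, " \t\r\n[],");
        if ($line === "") {
            continue;
        }
        $row = json_decode($line, true);
        if ($row === null) {
            continue; // skip any malformed line
        }
        // $row is now a small associative array,
        // e.g. $row["book_name"], $row["text_entry"] --
        // ready to INSERT into MySQL via PDO/mysqli.
    }
    fclose($handle);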

Here is the format of the json/txt file after help from this question:

[{"index":"tolstoy","index_id":0,"type":"introduction","line_id":1,"book_name":"War and Peace", "speech_number":"","line_number":"","speaker":"","text_entry":"Introduction"},
{"index":"tolstoy","index_id":1,"type":"heading","line_id":2,"book_name":"War and Peace","speech_number":"","line_number":"","speaker":"","text_entry":"Tolstoy's World"}]
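The objects are flat, so each JSON key can map directly to a column. A table along these lines could hold the data (column types are guesses from the sample values; note that `index` is a reserved word in MySQL, so that column is renamed here):

    CREATE TABLE IF NOT EXISTS book_lines (
        id INT AUTO_INCREMENT PRIMARY KEY,
        index_name VARCHAR(64),   -- the JSON "index" key, renamed
        index_id INT,
        type VARCHAR(32),
        line_id INT,
        book_name VARCHAR(255),
        speech_number VARCHAR(16),
        line_number VARCHAR(16),
        speaker VARCHAR(255),
        text_entry TEXT
    );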

I have tried the approaches in this question, which get quite involved, without success. Is there a straightforward one-shot way to do this that does not involve iterating line by line or converting to .csv?

Thanks for any suggestions.

  • What do you mean "too large to decode"? PHP can do it if there's enough free memory, or it isn't configured with a limit that's too low. Your host might have really tight limits, which is often a problem both for uploading and processing. Can you use [NDJSON](http://ndjson.org) instead? These can be parsed one record at a time instead of having to commit to parsing the whole thing. – tadman Mar 21 '21 at 19:03
  • Most libraries are just php code, how can a host object to you using php code? – Nigel Ren Mar 21 '21 at 19:19
  • Where do I put the files? Just in the same directory as the rest of the files? I do have FTP access. – user6631314 Mar 21 '21 at 19:35
  • Why use PHP for that anyway? Upload the file and use LOAD DATA INFILE. – nbk Mar 21 '21 at 19:35
  • Tadman, I am already at 512M, which is the maximum for my server plan. I'm not sure I can use one record at a time, because the main purpose of this large file is to conduct searches on it. – user6631314 Mar 21 '21 at 19:37
  • nbk you mean something like CREATE TABLE IF NOT EXISTS jsons_test( annotation_id INT AUTO_INCREMENT NOT NULL UNIQUE, json_data json); LOAD DATA LOCAL INFILE 'C:\\some\\path\\test.json' INTO table json_test(json_data); – user6631314 Mar 21 '21 at 19:39
  • Yes, exactly. PHP has timeouts that can be problematic, so try the direct approach. – nbk Mar 21 '21 at 19:43
  • Also make sure you set `max_allowed_packet` large enough. See https://dev.mysql.com/doc/refman/5.7/en/packet-too-large.html – Bill Karwin Mar 21 '21 at 22:58
  • When I run it, I get the error: "You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'json )' at line 1". I think this refers to the last "json" before ")" in the first line. I've tried removing "json" but then get an "unrecognized data type" error. – user6631314 Mar 22 '21 at 13:59
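For what it's worth, one common cause of a syntax error "near 'json )'" is that the native JSON column type only exists in MySQL 5.7.8 and later; on older servers the parser does not recognize `JSON` as a type at all. A hedged fallback (assuming an older server, and reusing the table name from the comment above) is to store the raw JSON in a LONGTEXT column instead:

    -- Fallback for MySQL < 5.7.8, where the JSON type is unavailable
    CREATE TABLE IF NOT EXISTS jsons_test (
        annotation_id INT AUTO_INCREMENT NOT NULL UNIQUE,
        json_data LONGTEXT
    );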
