I have a 16 GB file. I'm trying to make an array of its contents, split on line breaks. Right now I'm using file_get_contents with preg_split, as follows:

$list = preg_split('/$\R?^/m', file_get_contents("file.txt"));

However, I get the following error:

Fatal error: Allowed memory size of 10695475200 bytes exhausted (tried to allocate 268435456 bytes) in /var/www/html/mysite/script.php on line 35

I don't want to use too much memory. I know you can read the file incrementally with fopen, but I'm not sure how to create an array from the contents of the file using the line break as the delimiter.

The linked question does not address how I would make an array from the contents of the file with preg_split, similar to what I do above.
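
For reference, a minimal fgets-based sketch of the line-by-line reading mentioned above (untested at 16 GB scale; note it still stores every line in $list, so it only avoids the single huge string that file_get_contents builds, not the array itself):

    <?php
    // Read file.txt one line at a time instead of loading it all at once.
    $list = [];
    $handle = fopen("file.txt", "r");
    if ($handle === false) {
        die("Could not open file.txt");
    }
    while (($line = fgets($handle)) !== false) {
        $list[] = rtrim($line, "\r\n"); // drop the trailing newline, as preg_split would
    }
    fclose($handle);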

Josh Holly
  • 16GB is huge. You shouldn't use this approach at all. What is it you intend to do with the data? It would be better if you can insert it into a database and then do the actions / filtering / sorting / etc that you want to do. – Clay Oct 21 '16 at 04:53
  • The file is a list of URLs, the script is going to go through each URL of the whole file, and grab some information off of every URL. Client needs this done, and needs all the data before I get paid. – Josh Holly Oct 21 '16 at 04:55
  • Possible duplicate of [How to read a file line by line in php](http://stackoverflow.com/questions/13246597/how-to-read-a-file-line-by-line-in-php) – Clay Oct 21 '16 at 04:56
  • Check out the answer to that very similar question in my comment above. You want to basically read the file line by line instead of loading all the data at once. – Clay Oct 21 '16 at 04:56
  • How would I read the file line by line, and create a big array out of that? Because right now it's a foreach loop. See my post above, I'm very aware of output buffers, but I really need to create this big array. I need an alternative that will do exactly what my code does already (which works fine on small/medium files) to work on larger files. – Josh Holly Oct 21 '16 at 05:09
  • You can't, it is too large. If your computer has 16GB RAM or more maybe it is worth a try (or use the cloud maybe?) but probably incredibly slow. Rework your code, use a database or use the link I suggested, I'm honestly not sure what else to suggest. – Clay Oct 21 '16 at 05:41
  • You can configure this in the php.ini file, or set it in the script with ini_set('memory_limit', '256M'); – pawan sen Oct 21 '16 at 06:12
  • So what's the best way to work with this big data to automatically work with the whole list? – Josh Holly Oct 21 '16 at 06:12
  • @pawansen I tried raising the memory limit to 7 GB, and it still didn't work. – Josh Holly Oct 21 '16 at 06:12
  • I can split the files and have it work with one file at a time. What's the maximum file size (MB/GB) you would recommend for use with file_get_contents? – Josh Holly Oct 21 '16 at 06:15
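
A hedged sketch of the streaming approach suggested in the comments above: rather than materializing one 16 GB array, a generator yields one URL at a time, so memory use stays roughly constant regardless of file size. The names readUrls() and fetchInfo() are hypothetical, and the per-URL work is a placeholder for whatever the script actually does:

    <?php
    // Yield URLs from the file one at a time instead of building a giant array.
    function readUrls(string $path): Generator {
        $handle = fopen($path, "r");
        if ($handle === false) {
            throw new RuntimeException("Cannot open $path");
        }
        try {
            while (($line = fgets($handle)) !== false) {
                $url = trim($line);
                if ($url !== "") {
                    yield $url; // hand back one URL; memory stays flat
                }
            }
        } finally {
            fclose($handle); // runs even if the consumer stops early
        }
    }

    foreach (readUrls("file.txt") as $url) {
        $info = fetchInfo($url); // hypothetical placeholder for the per-URL scraping
        // ...write $info to a database or an output file here, rather than
        // accumulating results in memory...
    }

Writing each result out as it is produced (to a database or an output file) keeps the whole run within a fixed memory budget, which is the point of the comments above.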

0 Answers