I am trying to find an algorithm better than this PHP function, because it blows the memory limit when running on a string of 1,000,000,000,000 characters:
substr_count($string, $needle, $offset, $length)
Any help will be appreciated.
If your data is that huge, you will indeed run into memory problems.
You should write your own code using a file handle and read the data in chunks.
Have a look at fopen() and fread():
Open a file handle to your huge file with fopen(), then read chunks of data with fread(). You have to provide the logic yourself to handle the case where the string you are looking for is split across two chunks (see the sketch below).
That way you will NOT have the whole file in memory, only the part you just read with fread().
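Something along these lines, as a minimal sketch. It assumes the data lives in a file (the path 'huge.txt', the needle and the 1 MB chunk size are placeholders you would adjust). The trick is to carry the last strlen($needle) - 1 bytes of each buffer over into the next iteration, so a needle split across a chunk boundary is still counted, but never counted twice:

<?php
// Count occurrences of $needle in a huge file without loading it all into memory.
function count_needle_in_file(string $path, string $needle, int $chunkSize = 1048576): int
{
    $handle = fopen($path, 'rb');
    if ($handle === false) {
        throw new RuntimeException("Cannot open $path");
    }

    $count   = 0;
    $overlap = '';                   // tail of the previous buffer
    $keep    = strlen($needle) - 1;  // bytes that could start a match split across chunks

    while (!feof($handle)) {
        $chunk = fread($handle, $chunkSize);
        if ($chunk === false || $chunk === '') {
            break;
        }
        $buffer = $overlap . $chunk;
        $count += substr_count($buffer, $needle);
        // A full needle never fits inside the kept tail alone (it is one byte
        // too short), so re-scanning it next round cannot double-count.
        $overlap = $keep > 0 ? substr($buffer, -$keep) : '';
    }

    fclose($handle);
    return $count;
}

echo count_needle_in_file('huge.txt', 'needle'), PHP_EOL;

This keeps at most one chunk plus a few bytes of overlap in memory at any time, regardless of how large the file is.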