Showing off a HUGE Text file on a website

Ok, so I have a huge text file (around 2 gigabytes) with about 2 billion lines. All I've tried so far is this:

$myfile = fopen("C:\Users\server\Desktop\primes.txt", "r") or die("Unable to open file!");
while(!feof($myfile)) {
    echo fgets($myfile) . "<br>";
}
fclose($myfile);

but it has a problem: it doesn't finish all of the lines and hangs somewhere in the first quarter, and it also takes a long time to load. The second thing I've tried was this:

$path="C:/Users/server/Desktop/Server files/application.windows64/";
$file="primes.txt";

//read file contents
$content="
        <code>
            <pre>".file_get_contents("$path/$file")."</pre>
        </code>";

//display
echo $content;

But it wasn't even loading any lines.

Also, I can't open this file directly, and it has to stay on my Desktop. Please don't suggest copying or moving it into another directory.

Could I get any suggestions to help me along, or at least an explanation of why it's not working?

Sorry, my English isn't as good as it should be.

MG lolenstine
  • Leverage the HTTP server, search for X-SendFile – Geoffrey Aug 04 '17 at 23:22
  • 1
    It has been 10 years since I've done any web server config, but I seem to recall PHP having a max file size limit. I have no idea whether this is the problem, it's just a very random guess. – stevieb Aug 04 '17 at 23:24
  • 1
    Loading that file would take several minutes anyway (or even hours), nobody today waits for a page to load more than 10 secs or so ... – Teemu Aug 04 '17 at 23:25
  • This may be helpful: https://stackoverflow.com/questions/4183658/php-upload-max-filesize – DYZ Aug 04 '17 at 23:28
  • Are you trying to render the file at an HTML `document`? – guest271314 Aug 04 '17 at 23:28
  • @guest271314 That's what `echo $content;` does? – Teemu Aug 04 '17 at 23:29
  • 2
    Probably 99.99999% of all users in the world will only get a browser crash with a 2GB file. – Aganju Aug 04 '17 at 23:30
  • @Teemu Then OP should be able to stream the file in chunks to an HTML `document`, instead of trying to read and `echo` the file at once. Allow user to select, for example, 1MB or less chunks of the file to read at a time. – guest271314 Aug 04 '17 at 23:30
  • @guest271314 No... don't chunk, this is the worst advice and causes a stupid amount of server load... leverage the HTTP server's ability to do this and offload it by means of XSendFile. – Geoffrey Aug 04 '17 at 23:34
  • @Geoffrey How so? Break the single file into manageable chunks at server and serve those chunks. If you have decided that your proposed solution is optimal, then post an Answer. – guest271314 Aug 04 '17 at 23:36
  • @guest271314 See: https://coderwall.com/p/8lpngg/x-sendfile-in-apache2. When chunking such a huge file you tie up a PHP process, and risk hitting the max execution time which will terminate your script before it completes. The HTTP server is DESIGNED to serve out static content from files, making PHP chunk the content out like this is just ridiculous. – Geoffrey Aug 04 '17 at 23:36
  • @Geoffrey Where does OP mention serving the file for user to download the file? OP appears to be trying to render the file at an HTML `document` – guest271314 Aug 04 '17 at 23:37
  • @guest271314 It doesn't matter if it is a 'download' or not, you can serve HTML content using the same method. – Geoffrey Aug 04 '17 at 23:38
  • @Geoffrey Since you have drawn the conclusion that your suggested approach is the only viable approach not sure why you have not posted an Answer to resolve the inquiry? – guest271314 Aug 04 '17 at 23:39
  • Is there a way for me to use HTML's iframe to show the file? – MG lolenstine Aug 04 '17 at 23:43
  • @MGlolenstine Why do you need to render the entire file at the same time? Can you not serve and render, for example less than 1MB chunks of the file at a time, depending on which portion of the time user requests? – guest271314 Aug 04 '17 at 23:43
  • True, I don't... But I'm trying to find a way for it to be possible... TBH I was thinking about making it chunked, and I'll probably do that... I was just wondering. Thanks to all of you! – MG lolenstine Aug 04 '17 at 23:46
  • 1
    You can make a form of slides of the HTML, similar to code that renders or appears to render books at browsers, i.e.g., `epub`. That is, serve less than 1MB chunks of the file at a time, when user clicks next or previous page, delete the current page from memory and render the new requested page. You could alternatively cache several pages at a time, to not have to request the previous and next page again. – guest271314 Aug 04 '17 at 23:49
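
The X-SendFile route Geoffrey mentions can be sketched roughly as follows. This is an assumption-laden sketch, not a drop-in fix: it presumes Apache with the third-party mod_xsendfile module installed and enabled for the file's directory, and it reuses the path from the question.

```php
<?php
// Sketch only: assumes Apache with mod_xsendfile and, in the server config,
//   XSendFile On
//   XSendFilePath "C:/Users/server/Desktop"
// PHP emits headers and exits immediately; the HTTP server itself streams
// the 2 GB file, so no PHP process is tied up and no execution limit is hit.
header('Content-Type: text/plain');
header('X-Sendfile: C:/Users/server/Desktop/primes.txt');
exit;
```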

1 Answer

    // Serve the file as a download instead of echoing it into the page.
    $attachment_location = $_SERVER["DOCUMENT_ROOT"] . $file;
    if (file_exists($attachment_location)) {
        header($_SERVER["SERVER_PROTOCOL"] . " 200 OK");
        header("Cache-Control: public");
        header("Content-Type: text/plain");
        header("Content-Length: " . filesize($attachment_location));
        header("Content-Disposition: attachment; filename=file.txt");
        readfile($attachment_location); // streams the file without loading it all into memory
        die();
    } else {
        die("Error: File not found.");
    }

It's not working because it's a 2 gig file and you're trying to output it to the screen instead of letting the user download it or open it on their own machine. That should be the only way this file is delivered: 2 gigs output to a browser window would probably crash the client, the server, or both.

http://php.net/manual/en/function.readfile.php

If you really want to display the file, it would technically be possible to display a certain percentage of it at a time, with a pager that, when clicked, switches between different portions of the file.

http://php.net/manual/en/function.fseek.php
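
A minimal sketch of that pager idea, hedged: the `read_page()` helper, the 1 MB default page size, and the demo file are illustrative choices, not part of the answer above.

```php
<?php
// Sketch of the fseek() pager: serve one fixed-size "page" of a large file
// at a time instead of echoing the whole thing into the response.
function read_page(string $path, int $page, int $pageSize = 1048576): string {
    $fh = @fopen($path, 'rb');
    if ($fh === false) {
        return '';
    }
    fseek($fh, $page * $pageSize);   // jump straight to the requested page
    $data = fread($fh, $pageSize);   // read at most one page worth of bytes
    fclose($fh);
    return $data === false ? '' : $data;
}

// Demo on a small temporary file; a real pager would take the page number
// from $_GET and wrap the output in htmlspecialchars() inside <pre> tags.
$tmp = tempnam(sys_get_temp_dir(), 'primes');
file_put_contents($tmp, "2\n3\n5\n7\n11\n");
echo read_page($tmp, 0, 4);          // prints the first 4 bytes: "2\n3\n"
unlink($tmp);
```

Because fseek() jumps directly to the requested byte offset, serving page 1000 costs no more than serving page 0, which is what makes this workable on a 2 GB file.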

Kuba hasn't forgotten Monica
thomasmeadows
  • delivers the file, and recommends downloading it. – thomasmeadows Aug 04 '17 at 23:30
  • This still relies on PHP to serve the data out, with such a huge file this not only holds up the PHP process on the server just to send a blob of static data, but also runs the risk of the max execution time being exceeded. Use XSendFile for stuff like this and let the HTTP server handle it. – Geoffrey Aug 04 '17 at 23:32
  • but he is trying to have it delivered from a website. Title: **Showing off a HUGE Text file on a website** – thomasmeadows Aug 04 '17 at 23:38
  • Thanks for the answer, but I don't want to download it, I'd just like to show it... If there is no other way, I can even show it in more pages, but I want to know if there is a way to show the whole file on one page. – MG lolenstine Aug 04 '17 at 23:41