
I have a pretty big index.php file (about 500 kB) in which lots of logic and database queries are present (the index.php that actually goes to the client is about 200 kB).
What I'd like to do is, first of all, compress the file using gzip, which I do by simply adding SetOutputFilter in my .htaccess file. Now, since the file is pretty big for the server to process, TTFB can take a while, and therefore I'd like to send the head of the file to the user before even looking at the query, so that the browser discovers the images, CSS and JS (which are also pretty big) and starts downloading them instead of sitting idle waiting for the whole index.php to be processed on the server. (I'd like something like what Google Search does: it starts downloading PNGs before the whole page is loaded, and the page is compressed.)
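For reference, my .htaccess entry looks something like this (assuming mod_deflate is enabled):

# Compress the generated output before sending it (requires mod_deflate)
SetOutputFilter DEFLATE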
I did some searching and couldn't find any straightforward solution. All I found was that I can either disable gzip and use flush, or use gzip and no flushing.
But as you can see, in my case I need both, and I know this can be done somehow, possibly with some workarounds.
This is what large modern websites already do, so I'd like to know how.

Core_dumped
  • Looks to me like you are optimizing the wrong thing. If your `index.php` is 500 kB with lots of logic, that doesn't mean the resulting HTML content is also very big; all the logic is dropped. Are you sure the problem is that the resulting HTML is very big and GZIP is the solution? – Hugo Delsing Feb 03 '15 at 07:46
  • Thanks for the comment. I know the size of the raw index.php is big because of the logic, but it becomes smaller when sent to the user's browser (since we send only the output, not the PHP logic). I said 500 kB to make clear that there is a fair amount of logic to be executed. Indeed, when sent to the browser the file is something like 200 kB. So yes, I need gzip nonetheless. – Core_dumped Feb 03 '15 at 07:48
  • Ok, and when you use GZIP it suddenly is like `50 kB` or so? Like a drastic improvement? Because on a normal connection these days 200 kB is almost instant. But anyway, what you want can't be done. You might be able to start sending data "flushed", but the browser needs the complete content before it can unpack it and thus start rendering and see what other files need to be loaded. – Hugo Delsing Feb 03 '15 at 07:54
  • Look at Google Search, that's exactly what it does. It compresses the HTML and at the same time sends the head of the file as soon as the request arrives at their server (before even looking at the query), and if you look carefully in the dev tools, as soon as 10% of their HTML is downloaded, other files like PNGs start being downloaded in parallel. How do you explain that? It's because the browser doesn't need the whole HTML to start processing it: as soon as the browser starts downloading parts of the HTML, it starts parsing it and constructing the DOM in parallel. – Core_dumped Feb 03 '15 at 08:06
  • Not sure how your Google works, but mine loads a fast `30 kB` gzipped file in about `200 ms` and starts to load everything else another `200 ms` later. – Hugo Delsing Feb 03 '15 at 08:15
  • Try again, and look carefully at the .png files, which start downloading before the whole HTML is downloaded. – Core_dumped Feb 03 '15 at 08:17
  • I tried several web/image searches in Firefox and Chrome, and all show that the files start to load after the initial file is completely downloaded and parsed. – Hugo Delsing Feb 03 '15 at 08:53
  • I don't know; try reducing your speed to 50 kbps in the dev tools and maybe you'll notice. Just type something in the search bar (like stackoverflow) and then fully reload the page with cache disabled. – Core_dumped Feb 03 '15 at 08:55

3 Answers


This doesn't address the GZIP-and-flush issue directly, but rather the PHP script and page design side of your question about preloading CSS, HTML, etc.

You may want to consider splitting your index.php workload between two scripts: first load the HTML for display purposes, then request the "heavier" tasks asynchronously using AJAX and update portions of the screen from the response. This lets the CSS and all the rest do their work first, with the longer-running tasks displaying their results later.

To accomplish this, start off with a "lightweight" index.php file containing the basic page HTML and display logic, plus an event/trigger like $(window).load(function(){ /* AJAX call to the heavier heavy_index.php script, updating the screen from the response */ }), which allows the page to render completely and only then call the heavier stuff.

Here's a quick example:

index.php

<!DOCTYPE html>
<html>
    <head>
        <link rel="stylesheet" href="/my_css.css">

        <script src="//ajax.googleapis.com/ajax/libs/jquery/1.11.2/jquery.min.js"></script>
        <script type="text/javascript">

            // Once the initial page has finished loading, fetch the
            // heavy content asynchronously and inject it into the page.
            $(window).load(

                function () {
                    alert("About to load more content");

                    $.ajax({
                        url: "/heavy_index.php",
                        success: function (html_data) {
                            // Replace the placeholder with the heavy content
                            $("#content_loaded_later").html(html_data);
                        }
                    });
                });

        </script>
    </head>

    <body>
        <div class='content_initial'><span>Content initially loaded</span></div>
        <div class='content_later' id='content_loaded_later'>loading...</div>
    </body>
</html>

heavy_index.php

<?php
// All the heavy logic and database queries go here.
echo "resulting content from heavier workload";
?>

my_css.css

.content_initial
{
    border: 1px solid red;
    width: 120px;
    height: 120px;
    margin: 10px;
}

.content_later
{
    border: 1px solid green;
    width: 120px;
    height: 120px;
    margin: 10px;
}

You may also want to look at this post: Preload CSS/Javascript without Execution

Hope this helps.

Chris du Preez

Luckily I was wrong. Even though a GZIP chunk needs to be downloaded completely before you can unpack it, you don't have to send just one chunk. You can send several separate chunks, each one encoded separately.

This means the browser needs to download chunk one completely, and then it can unpack it and start parsing the HTML while it is downloading chunk two.

Progressive rendering via multiple flushes is a nice article explaining how it works. Note, however, that this part is handled not by PHP but by the server/Apache.

Check out How to make PHP generate Chunked response for the PHP part you need to do.
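To give a feel for it, here is a minimal sketch of that idea (not tested against your setup; heavy_query() is a hypothetical placeholder for your slow logic, and it assumes mod_deflate re-compresses each flushed chunk):

<?php
// Send the static top of the page immediately so the browser can start
// fetching CSS/JS/images while the server keeps working on the body.
echo '<!DOCTYPE html><html><head>';
echo '<link rel="stylesheet" href="/my_css.css">';
echo '</head><body>';

// Unwind PHP's output buffers and push the bytes to the web server,
// which sends them as the first chunk of a chunked response.
while (ob_get_level()) ob_end_flush();
flush();

// Now do the expensive work; the browser is already downloading assets.
echo heavy_query(); // hypothetical placeholder for the slow logic

echo '</body></html>';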

Making GZIP work this way depends on how your server is set up; for help with that, your best bet would be Server Fault.

Hugo Delsing

As far as I know, there is no way within Apache to force early output of content to the browser. However, it is possible to do so from PHP. Note that PHP output buffers can be layered, hence you may need to:

while (ob_get_level()) ob_end_flush();

This will send the data back to Apache without closing stdout. In the absence of other complications, that will trigger a chunked response to the browser. But the mod_deflate output filter also buffers data (DeflateBufferSize, 8 kB by default). If your <head> (NOT your header!) is smaller than this, it will sit in the buffer until it is pushed out by more content. You can reduce the size of the buffer, and you can pad your content to fill it; in practice you should use both methods.
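As a rough sketch of both methods combined (the 1 kB padding and the lowered buffer value are illustrative, not tested; DeflateBufferSize would be lowered in the Apache config):

<?php
// Sketch: flush the <head> past mod_deflate's buffer so the browser
// receives it early. Assumes Apache is configured with a smaller buffer,
// e.g. "DeflateBufferSize 1024" in the vhost or server config.
echo '<!DOCTYPE html><html><head>';
echo '<link rel="stylesheet" href="/my_css.css">';
echo '</head><body>';

// Pad with an HTML comment so the output exceeds the deflate buffer
// and is actually pushed to the client rather than sitting in it.
echo '<!-- ' . str_repeat(' ', 1024) . ' -->';

// Unwind all layered PHP output buffers, then flush to Apache.
while (ob_get_level()) ob_end_flush();
flush();

// ...heavy processing continues here; the rest of the page follows...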

Since other people have said that this is impossible (it is not; try it) and have described using AJAX to load the page, you might want to take a look at PJAX. There are big advantages to using this on a very JavaScript-heavy site.

symcbean