
I am having the following problem:

  1. I am running a BIG memory process but have divided the memory load into smaller chunks, so there is no CPU time-out issue.
  2. On the server I am creating .xml files of around 100 KB each, and 100+ of them will be created.
  3. The main problem is that the browser shows a response time-out, and IE, at the bottom (just above the status bar), shows a .php file download message.
  4. During all this, the backend (server-side) process is still running and continuously creating .xml files in incremental order, so there is no issue there.

I have the following php.ini configuration.

max_execution_time = 10000     ; Maximum execution time of each script, in seconds
max_input_time = 10000         ; Maximum amount of time each script may spend parsing request data
memory_limit = 2000M           ; Maximum amount of memory a script may consume
; Maximum allowed size for uploaded files.
upload_max_filesize = 2000M

I am viewing my site in IE, and I am using ZSCE (Zend Server CE) with PHP 5.3.

Can anybody point me in the right direction on this issue?

Edit:

I am uploading an image of the time-out and of the resulting prompt asking to download the .php file.



Edit 2:

Let me briefly explain my execution flow:

  1. I have one PHP file with objects of the class hierarchies, which starts by executing Function1() from each class hierarchy.
  2. I have a class file.
  3. First, let's say, Function1() is executed, which contains the logic for creating the XML files in chunks.
  4. Second, let's say, Function2() is executed, which displays the output generated by Function1().

All of this is done through the class hierarchies, so I can't terminate the execution of Function1() partway through; only after it has finished is Function2() called.

Edit 3:

This is specially for @hakre.

You asked some follow-up questions, and I agree with some of your points, but let me describe the issue in more detail.

  1. At first I was loading 100+ MB XML files at a time; this exhausted the memory on my local setup, froze everything on the machine, and consumed most of the CPU.

  2. I then divided these big XML files into smaller ones (now I load a single XML file at a time and unload it after use). This solved the memory-overload and CPU issues on my local setup.

  3. Now my backend process runs with no CPU or memory issue, but the browser still times out. I even tried cURL, but with my current structure it does not seem to fit because of my class hierarchy: I have a set of classes in a hierarchy, and they all execute their Process functions first, and only then their Output functions. So until the Process functions have finished, the Output functions never come into the picture, and that is why the browser shows a time-out.

  4. I also followed the instructions suggested by @vortex and had a little success, but not what I am looking for. The reason I could not implement cURL is that my Process function creates the required XML files in one go, so it takes too long before anything is output to the browser. Since the Process function takes that much time, no output can be sent to the client until it has completed.

cURL Output:

URL....: myurl
Code...: 200 (0 redirect(s) in 0 secs)
Content: text/html Size: -1 (Own: 433) Filetime: -1
Time...: 60.437 Start @ 60.437 (DNS: 0 Connect: 0.016 Request: 0.016)
Speed..: Down: 7 (avg.) Up: 0 (avg.)
Curl...: v7.20.0

Contents of the test.txt file:

* About to connect() to mylocalhost port 80 (#0)
*   Trying 127.0.0.1... * connected
* Connected to mylocalhost (127.0.0.1) port 80 (#0)
> GET myurl HTTP/1.1
Host: mylocalhost
Accept: */*

< HTTP/1.1 200 OK
< Date: Tue, 06 Aug 2013 10:01:36 GMT
< Server: Apache/2.2.21 (Win32) mod_ssl/2.2.21 OpenSSL/0.9.8o
< X-Powered-By: PHP/5.3.9-ZS5.6.0 ZendServer
< Set-Cookie: ZDEDebuggerPresent=php,phtml,php3; path=/
< Cache-Control: private
< Transfer-Encoding: chunked
< Content-Type: text/html
<
* Connection #0 to host mylocalhost left intact
* Closing connection #0

Disclaimer: The accepted answer was chosen based on the first small success it produced. The solution from @hakre is also feasible when this type of question occurs. Right now no answer has fully fixed my issue, only partially. hakre's answer also has more detail for anyone looking for more information about this type of issue.

Smile
  • Why not show the user a message like "Your data is being processed, come back in a few minutes." This way, the user has a direct feedback and does not have to wait for the page to render. – Lars Ebert Jul 25 '13 at 07:42
  • @LarsEbert, I can't do that because I am displaying the result on the same page. So when I click an icon, the request for this page goes off for processing and returns with the output (which needs to be displayed on the page). – Smile Jul 25 '13 at 08:20
  • I haven't tested this, but what happens if you emit an XML header straight away, then start doing periodic white space? The header should allow the client to receive the rest of the XML document in expected fashion. – Owen Beresford Aug 06 '13 at 06:27
  • @OwenBeresford, Can you explain again? – Smile Aug 06 '13 at 06:35
  • You can't solve this problem without putting a queue mechanism in place. First off, having a browser waiting for response (thus keeping the process running) is bad because the client can hang up at any time. The client has timeout during which it must receive a response or it automatically hangs up (which is what happens to you). The only way to solve this is to queue a job and once the job is done (job = file creation), you notify the user (or create a page where finished files are displayed). If you want to pursue your idea - well, it simply won't work. – N.B. Aug 06 '13 at 10:24
  • as pointed by @Андрей Почекуев the problem actually is in the web server, which closes connection before the script ends (you may see something like "Premature end of script headers" in the server log. Now the tricky part is that you can run PHP with Apache in many different ways - using mod_php, fastcgi or something else. So please post how you are running PHP and add the relevant part of the apache/respective module config. There are multiple timeout settings that can take part in the problem – Maxim Krizhanovsky Aug 06 '13 at 14:56
  • @nullvoid, if you write a HTTP content type header the browser will know that its getting XML. Setting the PHP execution time to infinite and the blocks of spaces should ensure the webserver keeps waiting for the XML. Your client will need to be configured to ignore empty whitespace between XML islands. Send the data as you eventually have it. Sorry for the delay, was on holiday. – Owen Beresford Aug 12 '13 at 21:24
  • @OwenBeresford, I appreciate your response, friend. But with my current structure I cannot send any data/XML to the browser until I complete the entire process, and this is the main cause of the problem. I tried an infinite time limit and other things but have not succeeded yet. I will surely contact you in case I need any further assistance ;) – Smile Aug 13 '13 at 04:19

8 Answers


Assuming you have made all the server-side modifications to dodge a server timeout (I saw pretty much everything explained above), in order to dodge the browser timeout it is crucial that you do something like this:

<?php
set_time_limit(0);        // no execution time limit for this script
error_reporting(E_ALL);   // surface every error while debugging
ob_implicit_flush(TRUE);  // flush automatically after every output call
ob_end_flush();           // drop the default output buffer

I can tell you from experience that Internet Explorer doesn't have any issues as long as you output some content to it every now and then. I run a 30 GB database update every day (it takes around 2-4 hours), and Opera seems to be the only browser that ignores the content output. If you don't set `ob_implicit_flush` you need to do an `ob_flush()` after every piece of content.

References

  1. ob_implicit_flush

  2. ob_flush

If you don't use ob_implicit_flush at the top of your script as I wrote earlier, you need to do something like:

<?php
echo 'dummy text or execution stats'; // any small output will do
ob_flush();

within your execution loop
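Putting the settings and the in-loop flush together, here is a minimal self-contained sketch; the chunk count, directory, and `process_chunk()` are my own placeholders standing in for the asker's Function1() logic, not code from the question:

```php
<?php
set_time_limit(0);        // no PHP execution time limit
ob_implicit_flush(true);  // flush automatically after every output call
while (ob_get_level() > 0) {
    ob_end_flush();       // drop any buffered output layers
}

// Placeholder for the real work of Function1(): create one small XML file.
function process_chunk(string $dir, int $i): void
{
    file_put_contents("$dir/chunk_$i.xml", "<chunk id=\"$i\"/>");
}

$dir   = sys_get_temp_dir();
$total = 100; // around 100 XML files, as in the question

for ($i = 1; $i <= $total; $i++) {
    process_chunk($dir, $i);
    if ($i % 10 === 0) {
        echo "processed $i of $total\n"; // heartbeat keeps the browser waiting
        flush();                         // push it through to the client
    }
}
```

The heartbeat every few chunks is the whole point: the browser keeps receiving bytes, so it never decides the connection is dead.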

vortex
  • Where to put last two functions? – Smile Aug 05 '13 at 08:19
  • Have you noticed my Edit 2 section, where I explained the execution flow? Until `Function1()` has executed, no display is possible, because `Function2()` (which displays the output) runs only after each class hierarchy's `Function1()` has executed. – Smile Aug 05 '13 at 08:36
  • I am suggesting you split these functions and add debug logging to each important step. If you don't, it will be harder to tell whether your settings worked. Let me be clear: say you have an echo every 5000 lines; if your script still fails, then the error is somewhere else. If your settings are correct, you can have a delay of even 30 minutes between your function and the output function and it will still work. – vortex Aug 05 '13 at 08:41
  • I checked your given suggestions but still "Browser Time Out" whereas background process for Pages creation is running. – Smile Aug 05 '13 at 08:50
  • please read [php connection handling](http://php.net/manual/en/features.connection-handling.php) and do as M Shahzad Khan and I suggested, split your execution into smaller chunks and add debugging output to it. you really can't get an answer if you don't output your code or don't follow the suggestions. – vortex Aug 05 '13 at 08:58
  • I am following your suggestions and had followed the others' suggestions as well, and I am outputting something at specific points, but the browser time-out still occurs. – Smile Aug 05 '13 at 09:00
  • can you do a phpinfo(); in your script and check if your values are correct ? there might still be a configuration error – vortex Aug 05 '13 at 10:10
  • What do you want me to see in phpinfo() output? Means what should I look and return back it to here? – Smile Aug 05 '13 at 10:25
  • @NullVoid you should check that the modifications you entered in your php.ini or apache are reflected to your phpinfo. also check for additionally loaded configurations – vortex Aug 05 '13 at 15:01
  • Well, don't express by code unless as well this is tested. E.g. ask for the current documentation first, then do assumptions about the configuration. Not try to enforce some configuration you don't even know if it is needed or not. Take care that asking a bad worded question on SO does invite other users to give you bad answers, so this is partially your fault because your question is not of good quality (in the meaning merely that it's not specific). Placing a bounty can enforce this even more. Just saying. – hakre Aug 06 '13 at 09:00
  • Vortex and @hakre, I am not using any database-related stuff in this question; I am only reading an XML file and creating XML files from it. About outputting something to the browser: as I said, with my current structure (as far as I know) I am not able to send any data to the browser until my Function1() has completed. I should admit that with Vortex's suggestion I got a little success, but not 100%. Based on your (vortex's) and hakre's answers I have now found that I should feed something to the browser while processing a big job. – Smile Aug 07 '13 at 08:44
  • As I got a little success from your "echo and flush()" combination, I am accepting your answer. – Smile Aug 07 '13 at 09:02
  • thanks a lot, you should still post updates here as I didn't answer for the bounty. hakre's answers deserve at least an upvote and hopefully you will solve your problem in the near future, provided you post your updates here – vortex Aug 07 '13 at 13:53

1. I am running a BIG memory process but have divided the memory load into smaller chunks, so there is no CPU time-out issue.

Now that's a wild guess. How did you find out it was a CPU time-out issue in the first place? Did you at all? If yes, what does your test show now? If not, how do you now test that this is not a time-out issue?

Although you state there won't be a certain issue, you don't prove it, and many questions remain open. That invites guessing, which is counter-productive for troubleshooting (which is what you are doing here).

What you write here just means that you wrote code to chunk memory; however, that is not a test for CPU time-out issues. One part is writing the code, the other part is testing it. Don't mix the two, and don't draw wild conclusions. Issues are established by tests; otherwise they didn't happen.

So much for your first point, just to show you that when troubleshooting you should look for facts (monitor, test, profile, step-debug), not run on assumptions. This is crucial, otherwise you look in the wrong places and ask the wrong questions.


From what you describe of how the client (browser) behaves, this is not a time-out issue per se. The problem you've got is that the gap between the header response and the body response is taking too long for your browser's taste. One browser assumes a time-out (a boundary value has been triggered, which looks more correct to me), and the other browser assumes something is coming up, so why not save it.

So you merely have a processing issue here. Please consult the manual of your web browsers (HTTP clients) to see which configuration values you can change to alter this behavior. E.g. monitor with a curl request on the command line how long the request actually takes, then configure your browser not to time out when connecting to that server within the amount of time you just measured. For example, if you're using Internet Explorer: http://www.ehow.com/how_6186601_change-internet-timeout-options.html or if you're using Mozilla Firefox: http://forums.mozillazine.org/viewtopic.php?f=7&t=102322&start=0

As you didn't show any code on the server side, I assume you want to solve this problem with client settings. curl will help you measure the number of seconds such a request takes. Use the -v (verbose) switch to obtain detailed information about the request.

In case you don't want to solve this on the client, curl will still help you measure important data and easily reproduce any underlying server-related timing issue. So you should go for curl on the command line in any case, especially as looking into the response headers might reveal what triggers the (again) esoteric Internet Explorer behavior. Again, the -v switch reveals the request and response headers.

If you like to automate such tests with a PHP script, it's also possible with the PHP Curl Extension. This has been outlined in:
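If helpful, here is a small sketch of such a timing helper using the PHP Curl extension; the function name and placeholder URL are mine, not from any linked answer:

```php
<?php
// Time an HTTP request and capture status, headers, and body with curl.
function time_request(string $url): array
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return body instead of printing it
    curl_setopt($ch, CURLOPT_HEADER, true);         // prepend response headers to the body
    curl_setopt($ch, CURLOPT_TIMEOUT, 0);           // no client-side time-out
    $body = curl_exec($ch);
    $info = curl_getinfo($ch);
    curl_close($ch);

    return [
        'body'    => $body,
        'code'    => $info['http_code'],
        'seconds' => $info['total_time'],
    ];
}

// Example (the URL is a placeholder for your long-running script):
// $r = time_request('http://localhost/long-running.php');
// printf("HTTP %d in %.3f seconds\n", $r['code'], $r['seconds']);
```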

hakre
  • I have added more details; please check **Edit 3**. – Smile Aug 06 '13 at 08:52
  • I didn't mean that you should implement curl. I was merely speaking about *using curl on the command-line* so that you can find out how long that HTTP request actually takes and what the response headers and body are. So most of what you've added is just "blah blah blah blah blah" :). Warm words, we don't see how much stress you can put on them. Instead start to run curl on the commandline, show the request and response headers and if there is a response body. Also tell us how long the request takes. Commandline now. Show the command. Show the output. Use the `-v` switch. – hakre Aug 06 '13 at 08:57
  • I very much appreciate your instructions, but I cannot implement cURL as my structure is different; the way I print the output does not fit with cURL. – Smile Aug 06 '13 at 09:41
  • I have added a trace of cURL on my local setup. Please have a look ;) – Smile Aug 06 '13 at 10:13
  • Okay, that's what I would consider some more information indeed :). The curl log reveals that you're not specifying a [content-length](http://tools.ietf.org/html/rfc2616#section-14.13) and even the response is chunked, there are no chunks. This both is not a problem per-se (see http://tools.ietf.org/html/rfc2616#section-4.4), however I assume you want to return some response body, right? What should the response body be? Not giving content-length having the chunked transfer encoding and then closing the connection limits your response body here as you transfer no chunks from your server. – hakre Aug 06 '13 at 13:51
  • I'd also say, you need to show your code. Try to reduce the problem to a little script so this can be easily reviewed and the script still times out. Have all the changes in the code already if you changed it (e.g. `set_time_limit(0);`). You probably made some *easy to spot in code* mistake. For debugging next to writing a file you can as well output debug information to the browser, e.g. a number which file you're currently processing. That will ensure that PHP knows there still is something to output, will send something to the browser and the browser waits. – hakre Aug 06 '13 at 14:05
  • you could always change [firefox keep alive timeout](http://kb.mozillazine.org/Network.http.keep-alive.timeout) or [ie keep alive timeout](http://support.microsoft.com/kb/813827) but as far as I see it, unless you have an error in your code, not sending out statistical data every now and then, not modifying the browser keep alive will refrain you from solving your problem. right now, running your script server side, creating a database log for your completed actions and creating another page that polls/ajax the database for new completed items seems to be the best way to "fix" your problem. – vortex Aug 06 '13 at 15:04
  • @vortex: The database is not a queue, don't use it as such. File-system is likely better for that job if not a dedicated queue service. Right now it's not clear to me if there should be some output or not. Also in which format, e.g. outputting HTML comments might do the work in the end, no clue if that's appropriate here. – hakre Aug 06 '13 at 15:07
  • I think you misunderstood :D I am not aware of any queueing system he might need but a json log would be useful in my opinion. I can't see the benefits of using file system-based logging in the era of key-value stores. nonetheless I am not that familiar with big queueing systems so I might have missed your scope. for that I can only be sorry but I think you should detail your method so that we finally get an accepted answer to this question ! as I stated earlier, refusal to circumvent limitations requires a script rework – vortex Aug 06 '13 at 15:16
  • @vortex: JSON log? log where into? JSON is not very well for sequential data as it's a finished structure. So I don't understand indeed what you mean. And the era of key-value stores is what area in specific? The 60s? – hakre Aug 06 '13 at 15:38
  • @hakre I am far from the 60's so I have absolutely no idea if nosql databases were around that time. as for JSON, having a key-value store database that can output json directly would be faster for polling or ajax retrieval. a really brute structure as file, date [ when the file generations started and finished] would be a good start. regardless of format, a non relational database would be of best use here since you can add any additional information you need to the logs on the fly – vortex Aug 06 '13 at 15:47
  • @vortex: The concepts of key value stores are back from the 60s. Also on of the principles you should know about: *The file-system is a database*. – hakre Aug 06 '13 at 16:06
  • @vortex and hakre, I very much appreciate the suggestions from both of you and your kind attention. I accept that I have to send at least some output to the browser while my process runs for a long time. I will check my current structure and follow your suggestion so that the browser time-out is not reached. As the bounty time is over and I first got a little success from vortex's answer, I am accepting his answer. But my issue is still not solved (with the caveat that I still have to check against my currently implemented structure). – Smile Aug 07 '13 at 08:50
  • hakre, again I appreciate your suggestions about implementing cURL. I tried it, but right now (with my structure) it seems using cURL, or otherwise outputting to the browser in between a long-running process, is not feasible. So I have to go for another solution. But I will surely try cURL if possible and will let you know when I overcome the issue. – Smile Aug 07 '13 at 08:54
  • You misread me. I only suggested curl for the trouble-shooting, as a kind of analysis tool, not for use in the production code. – hakre Aug 07 '13 at 21:35

The problem is with your web-server, not the browser.

If you're using Apache, you need to adjust the Timeout value in httpd.conf or in your virtual-host config.
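For reference, the relevant directive is Apache's core `Timeout`; the 600 seconds below is just an example value, not a recommendation:

```apache
# httpd.conf (or inside the relevant <VirtualHost> block)
# Seconds Apache waits for certain network I/O events before aborting the request.
Timeout 600
```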

Eternal1

You have 3 pages:

  1. Process - creates the XML files and then updates a database value saying that the process is done

  2. A PHP page that returns {true} or {false} based on the status of the process-completion database value

  3. An AJAX front end, polling page 2 every few seconds to check whether the process is done or not

Long Polling
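A minimal sketch of page 2, the status endpoint; the flag file here is my stand-in for the database value described above, and all names are placeholders:

```php
<?php
// status.php - "page 2" from the scheme above: report whether the long
// process has finished. A flag file stands in for the database value.
function process_is_done(string $flagFile): bool
{
    return file_exists($flagFile);
}

if (PHP_SAPI !== 'cli') {
    header('Content-Type: application/json');
}
echo process_is_done(__DIR__ . '/process_done.flag') ? 'true' : 'false';
```

The front end would poll this URL with AJAX and reload or render once it sees `true`.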

Ashneil Roy

Is it possible to send some output to the browser from the script while it's still processing, even white space? If so, do it; it should reset the timeout counter.

If that's not possible, you have to increase IE's timeout in the registry:

HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\Internet Settings

You need ReceiveTimeout; if it's not there, create it as a DWORD and set the value in milliseconds.
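For example, as a .reg file (the ten-minute value, 600000 ms = dword:000927c0, is only an illustration):

```reg
Windows Registry Editor Version 5.00

[HKEY_CURRENT_USER\SOFTWARE\Microsoft\Windows\CurrentVersion\Internet Settings]
"ReceiveTimeout"=dword:000927c0
```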

Marek
  • No, it's not possible to send any output until the process finishes, because I am displaying the result on the same page. – Smile Jul 25 '13 at 08:18
  • You are creating xml file, so it should be possible to output white space. Don't forget to flush() after each output. – Marek Jul 31 '13 at 12:11
  • I briefly explain my execution flow: 1. I have class file. 2. First, let say, Function1() is executed which contains logic of creating XML files in chunks. 3. Second, let say, Function2() is executed which will display output generated by Function1(). All is done in Class Hierarchies manner. – Smile Jul 31 '13 at 12:39
  • I have edited more details to my question. Please have a look for Flow of Execution. – Smile Jul 31 '13 at 12:48
  • between 2. and 3: echo " "; flush(); – Marek Jul 31 '13 at 13:07
  • Ok. But the problem is that `Function1()` takes so long that it causes the browser timeout. It never gives a chance to execute the `flush();` that I would add after `Function1()` finishes. – Smile Jul 31 '13 at 13:21
  • Ok, I will try it :) But Function1() has a loop which runs for so long that control does not return until after it. – Smile Jul 31 '13 at 13:33
  • I understand. You just give the browser something to chew on :) – Marek Jul 31 '13 at 13:48

I have had this issue several times while reading a large CSV file and putting it into a database. I solved it by dividing the reading-and-inserting process into smaller parts: I created a new table to log how much data had been read and inserted, and the page then reloads itself and resumes from that position. You can do the same by creating one XML file per attempt, then reloading the page and starting on the next one. This way the memory used by each request is refreshed. Hope it helps.
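A hedged sketch of that checkpoint-and-reload idea, applied to the question's ~100 XML files; the counter file (standing in for the log table) and all file names are placeholders:

```php
<?php
// Create one XML file per request, remember how far we got, and let the
// page reload itself to continue. A counter file stands in for the log
// table described above.
$dir          = sys_get_temp_dir();
$progressFile = $dir . '/xml_progress.txt';
$total        = 100; // around 100 XML files, as in the question

$next = file_exists($progressFile) ? (int)file_get_contents($progressFile) : 1;

if ($next <= $total) {
    file_put_contents("$dir/part_$next.xml", "<part id=\"$next\"/>"); // one file per request
    file_put_contents($progressFile, (string)($next + 1));           // checkpoint progress
    // a meta refresh makes the browser request the page again immediately
    echo "<meta http-equiv=\"refresh\" content=\"0\">Created part $next of $total";
} else {
    echo "All $total files created.";
}
```

Each request stays well under any timeout because it only does one file's worth of work.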

M Shahzad Khan
  • I am doing the same thing, dividing the process into small parts, but there is one big loop which iterates through the .xml files and creates the other, smaller .xml files. – Smile Jul 25 '13 at 08:17
  • I have edited more details to my question. Please have a look for Flow of Execution. – Smile Jul 31 '13 at 12:48

What is a "CPU time-out issue"?

The right way to solve the problem is to run the heavy stuff asynchronously, in a separate session group (not in the webserver's process tree).
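A minimal sketch of that idea on a Unix-like server; `worker.php` is a placeholder for the script that builds the XML files, and this relies on the external `setsid` utility:

```php
<?php
// Launch the heavy work detached from the web server's process tree.
// 'worker.php' is a placeholder for the script that creates the XML files.
// setsid starts it in its own session; redirecting output and appending '&'
// lets exec() return immediately instead of waiting for the worker.
$worker = __DIR__ . '/worker.php';
$cmd = 'setsid php ' . escapeshellarg($worker) . ' > /dev/null 2>&1 &';
exec($cmd);

echo "Job started; poll for the generated files to see progress.\n";
```

Note that the question's server runs Apache on Win32, where `setsid` does not exist; on Windows an equivalent launch would need something like `pclose(popen('start /B php worker.php', 'r'));` instead.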

symcbean

Try including set_time_limit(0); in your PHP script.

The following links might help you:

http://php.net/manual/en/function.set-time-limit.php

http://php.net/manual/en/function.ignore-user-abort.php
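Combining the two functions linked above, a minimal sketch (the long-running work itself is elided) of a script that neither hits PHP's time limit nor stops when the client disconnects:

```php
<?php
set_time_limit(0);       // disable PHP's execution time limit
ignore_user_abort(true); // keep running even if the browser gives up

// ... the long-running XML generation would go here ...
```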