
I have coded a PHP webpage. There is a while loop in the page which retrieves data from the database, does certain logic with the data, and updates the same data back in the database. This loop takes more than 4 minutes to complete, and only then is the remaining page code processed.

Now, while the page is loading and the while loop has not yet completed, if I press the Esc key (forcing the page to stop further processing) and then immediately refresh the page, every page on this same website takes very long to load. I assume that the remaining part of the loop is still running when I refresh the page.

I want this page to reload normally, without depending on the previously stopped action.

Do you have any solution to this problem in mind? I also tried it via AJAX, but the same problem persists.

Suman Ansari
  • The behavior you describe is very strange and should not happen under normal circumstances. Is there an issue with resource locking? ie. the script you cancelled is still running and locked a file the new script is trying to open? Or maybe you're writing to / reading from the same (locked) database table? – Halcyon Feb 25 '14 at 22:51
  • @FritsvanCampen It's not strange at all, if you're doing some intense database queries... or anything really that takes 4 minutes to complete. – Brad Feb 25 '14 at 22:51
  • (1) Make sure `ignore_user_abort` is off, and spam whitespace in the loop (it shouldn't make a difference if it's HTML, but the webserver needs to realize there's been a disconnect). If that doesn't work, (2) don't do big jobs in a webserver: make it an asynchronous job somewhere else, for instance with `gearman`. This is assuming the database itself isn't the problem. – Wrikken Feb 25 '14 at 22:52
  • @Brad Not really, this is why we have threads. I suppose if you have a really inefficient script you could see this behavior - inefficient in terms of resource usage, like files and database. – Halcyon Feb 25 '14 at 22:52
  • Why don't you use AJAX? – Karim Lahlou Feb 25 '14 at 22:53
  • @Wrikken The web server will wait for you indefinitely as long as the remote client doesn't connect. The remote client will wait a long time (hours) as long as the server has accepted the initial request. The user may not wait this long however. – Brad Feb 25 '14 at 22:54
  • @FritsvanCampen PHP isn't multithreaded. Even if it were, he's likely bound somewhere. Threading isn't the answer to every problem. – Brad Feb 25 '14 at 22:54
  • @KarimLahlou AJAX doesn't fundamentally solve the problem. – Brad Feb 25 '14 at 22:55
  • @KarimLahlou I have tried. AJAX has the same problem. – Suman Ansari Feb 25 '14 at 22:57
  • @brad: I'm talking about the 'cancelled long running job': no more client, but as long as the process itself does not send data, it can take a long while for the webserver to realize the connection isn't there anymore and cancel processing. However, I'm certainly more in favor of option (2). – Wrikken Feb 25 '14 at 22:58
  • @Wrikken Agreed, if data isn't being sent then the TCP connection sometimes won't know that it is stuck open. – Brad Feb 25 '14 at 23:01
  • @Wrikken Let me test the 2nd option suggested by you. I will try gearman. I will get back to you tomorrow. If in between you get an idea, Please let me know. Thanks – Suman Ansari Feb 25 '14 at 23:06
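Wrikken's first suggestion can be sketched as follows. This is a hedged illustration, not code from the thread: `processRows` and its callback are made-up names, and the per-row database work is elided. The idea is to check for a client disconnect on every iteration, after pushing a byte of output so the webserver actually notices the dropped connection.

```php
<?php
// Sketch: check an "aborted?" callback each iteration so a long loop
// stops promptly when the browser disconnects (Esc / refresh).
function processRows(array $rows, callable $aborted): int
{
    $done = 0;
    foreach ($rows as $row) {
        // ... per-row logic and database UPDATE would go here ...
        $done++;

        // Give the server a chance to detect the disconnect, then stop early.
        if ($aborted()) {
            break;
        }
    }
    return $done;
}

// In a real request the callback would echo and flush so the server sees
// the disconnect, then ask PHP whether the client is gone:
// ignore_user_abort(false);
// $aborted = function (): bool { echo ' '; flush(); return connection_aborted(); };
```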

1 Answer


What you need to do is spawn off an external process on request that does the work, then reports back the results.

I typically do this with a table in my database containing a job queue. When a job request comes in, it lands in the database. Then either a cron job checks for new work every minute, or I fire up code to run jobs if none are already in progress.

The status of the work (percent complete and so on) is updated in the table. When the user on the site requests the result, they can either see the status, or see the data when it is complete.
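The pattern above can be sketched like this. It is a hedged illustration, not the answerer's actual code: the `jobs` table, its columns, and the helper functions are assumptions, and SQLite in-memory stands in for the real database so the sketch is self-contained.

```php
<?php
// Sketch of the job-queue table: the page enqueues and returns immediately,
// a cron/worker process claims jobs, a status page reads progress back.
$db = new PDO('sqlite::memory:');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$db->exec("CREATE TABLE jobs (
    id       INTEGER PRIMARY KEY AUTOINCREMENT,
    payload  TEXT NOT NULL,
    status   TEXT NOT NULL DEFAULT 'queued',  -- queued | running | done
    progress INTEGER NOT NULL DEFAULT 0       -- percent complete
)");

// Called from the user-facing page: record the work, don't do it.
function enqueue(PDO $db, string $payload): int
{
    $db->prepare('INSERT INTO jobs (payload) VALUES (?)')->execute([$payload]);
    return (int) $db->lastInsertId();
}

// Called from the cron/worker process: claim the oldest queued job.
function claimNext(PDO $db): ?array
{
    $row = $db->query("SELECT * FROM jobs WHERE status = 'queued' ORDER BY id LIMIT 1")
              ->fetch(PDO::FETCH_ASSOC);
    if ($row === false) {
        return null;
    }
    $db->prepare("UPDATE jobs SET status = 'running' WHERE id = ?")->execute([$row['id']]);
    return $row;
}

// Called from the status page the user polls.
function jobStatus(PDO $db, int $id): array
{
    $stmt = $db->prepare('SELECT status, progress FROM jobs WHERE id = ?');
    $stmt->execute([$id]);
    return $stmt->fetch(PDO::FETCH_ASSOC);
}
```

The user's request now finishes in milliseconds regardless of how long the job itself takes; the 4-minute loop runs in the worker, disconnected from any page load.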

Brad
  • Will you please explain this a little more: "spawn off an external process on request that then reports back the results"? Thanks for your precious time. – Suman Ansari Feb 25 '14 at 22:57
  • And can you also tell me how to know that no cron job is running, and how to run this code via cron at that time? – Suman Ansari Feb 25 '14 at 22:58
  • @SumanAnsari If you go the cron job method, the cron job should check the database table to see if an existing job is in progress. You should also save the PID of the process doing the work so your cron job can check if that process is still running. (For instance, your process may start a job, and crash, never being able to update the table indicating that it's done.) – Brad Feb 25 '14 at 22:59
  • @SumanAnsari If you don't use a cron job, see this question: http://stackoverflow.com/a/45966/362536 – Brad Feb 25 '14 at 23:00
  • @SumanAnsari Finally, I should also note that there are entire queuing systems available off-the-shelf. If you end up needing anything terribly complex, you might use one of them. I haven't needed anything beyond the basics, and can't offer a specific recommendation. – Brad Feb 25 '14 at 23:01
  • where are these queuing systems ? and what is your best thought solution for this process. Any kind. which you think fit best – Suman Ansari Feb 25 '14 at 23:04
  • @SumanAnsari http://stackoverflow.com/q/5672545/362536 – Brad Feb 25 '14 at 23:04
  • I will test the method in this link and will get back to you tomorrow. bundle of thanks – Suman Ansari Feb 25 '14 at 23:08
  • Queuing systems: make your own, or use `gearman`, `beanstalkd`, `RabbitMQ`, and others. If you roll your own, beware of the _"Poison Message"_: if a workload causes fatal errors for some reason, all processing of further jobs may stop, unless you either (1) configure a maximum number of attempts, (2) spawn _every_ job in a different process, or (3) shuffle the order of jobs around (which will still halt processing on the Poison Message, but the chances are a significant proportion of the jobs can still be done). – Wrikken Feb 25 '14 at 23:12
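The PID check Brad describes in the comments above can be sketched as below. This is an illustration under stated assumptions, not code from the thread: `workerIsAlive` is a made-up helper; `posix_kill` with signal `0` is a real existence test but requires the posix extension, so a Linux `/proc` fallback is included.

```php
<?php
// Sketch: the worker stores its PID in the job row; the cron job uses it
// to tell "still running" from "crashed mid-work" (so stale jobs can be
// requeued instead of blocking the queue forever).
function workerIsAlive(int $pid): bool
{
    if ($pid <= 0) {
        return false;
    }
    return function_exists('posix_kill')
        ? posix_kill($pid, 0)          // signal 0: existence check, no signal sent
        : file_exists("/proc/$pid");   // Linux-only fallback
}
```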