
I am making a call to a process that generates PDFs on the backend using a REST request in Angular (I'm using Restangular to create these).

My problem is that I want the user to be able to continue working while this process runs, as the PDF generation can take a while. As a test, I have created a service that delays for 5 seconds before returning data to the client. I call it like this:

    var d = $q.defer();
    Restangular
        .all('delay')
        .getList()
        .then(function (data) {
            d.resolve(data);
        }, function () {
            d.reject();
        });
    return d.promise;

This works fine, but any other requests that I make before the call has finished get blocked. I would expect this to happen if I had chained the .then functions, but I am calling a completely separate REST request.

I have to wait for the service generating the PDFs to finish before returning to the client, otherwise the process won't complete. So do I have to wait, or is it possible to run two or more REST requests simultaneously?

Thanks!

EDIT: Nobody seems to be interested in this question! Anyway, I have been looking into the option of returning a response from PHP early to allow the JavaScript to keep on running, and this sort of half works. There are several versions of this answer scattered around Stack Overflow - this one's from here

    public function delay()
    {
        ignore_user_abort(true);
        set_time_limit(0);
        ob_start();
        // do initial processing here
        echo 'Text the user will see';
        header('Connection: close');
        header('Content-Length: ' . ob_get_length());
        ob_end_flush();
        ob_flush();
        flush();

        // do the long-running processing here
        $d = time();
        sleep(5);
        $t = time() - $d;
        echo 'Text the user will never see';
    }

    public function runtime()
    {
        $d = time();
        $t = time() - $d;
        return $t;
    }

In reality, what happens is that the PHP sends the response back to the browser and then carries on processing, as it should. However, the next request from the client still waits for the first one to finish processing before it is run.

The behavior I want is for the PHP to start a new thread, as if I were calling the method from another browser (which of course I have tested, and it works).

Craig Morgan

1 Answer


Another option is to have the server return a token for any job that is in progress, then have the client make a second request later to check the status of the job on a timer. Your client code can then continue to start new requests or do other work when not actively polling for a job's status.

Here's a very simplistic example:

CLIENT:

$.ajax({
  url: "startJob",
  success: function (response) {
    setTimeout(function () { pollForJobStatus(response.jobToken); }, 3000);
  }
});

function pollForJobStatus(jobToken) {
  $.ajax({
    url: "jobStatus",
    data: { token: jobToken },
    success: function (response) {
      if (response.job.status == 'done') {
        // DO SOMETHING WITH THE RESULTS
      }
      else {
        // keep polling, passing the same token along
        setTimeout(function () { pollForJobStatus(jobToken); }, 3000);
      }
    }
  });
}

SERVER:

// POST: '/startJob'
function($data)
{
    // take incoming data from client,

    //start 'job' on a new thread
    $job = new PdfJob($data);
    $job->start();

    //record the threadID info in database
    DB.store($job->processId, $job->jobToken);

    //return token to client
    return new ApiResponse($job->jobToken);
}


// GET: '/jobStatus'
function($token)
{
    // lookup job's process id in the db
    $pId = DB.retrieve($token);

    // use the processId to check the job status
    $jobStatus = checkJobStatus($pId);

    //return status to client
    return new ApiResponse($jobStatus);
}
TheJoe