Best Way to Manage Long-running PHP Script

Best way to manage long-running php script?

Certainly it can be done with PHP; however, you should NOT do this as a background task - the new process has to be dissociated from the process group in which it was initiated.

Since people keep giving the same wrong answer to this FAQ, I've written a fuller answer here:

http://symcbean.blogspot.com/2010/02/php-and-long-running-processes.html

From the comments:

The short version is shell_exec('echo /usr/bin/php -q longThing.php | at now'); but the reasons why are a bit long for inclusion here.

Update +12 years

While this is still a good way to invoke a long-running bit of code, it is good security practice to limit, or even disable, the webserver PHP's ability to launch other executables. And since this decouples the behaviour of the long-running task from that which started it, in many cases it may be more appropriate to use a daemon or a cron job.

How to restructure a long running php process to not time out

The code below indicates the basic logic to follow. It isn't tested code and should not be taken as a drop-in example.

Use a javascript loop

Instead of making a slow process slower - write your JavaScript to ask for smaller chunks of data in a loop.

E.g. the JavaScript could use a while loop:

$(document).ready(function () {
    var done = false,
        offset = 0,
        limit = 20;

    while (!done) {
        var url = "read_images.php?offset=" + offset + "&limit=" + limit;

        $.ajax({
            async: false,
            url: url
        }).done(function (response) {
            if (response.processed !== limit) {
                // asked to process 20, only processed <=19 - there aren't any more
                done = true;
            }
            offset += response.processed;
            $("#mybox").html("Processed total of " + offset + " records");
        }).fail(function (jqXHR, textStatus) {
            $("#mybox").html("Error after processing " + offset + " records. Error: " + textStatus);
            done = true;
        });
    }
});

Note that in the above example the AJAX call is forced to be synchronous. Normally you don't want to do this, but in this example it makes the code easier to write, and possibly easier to understand.
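Since synchronous XHR is deprecated in modern browsers, the same chunked loop can be sketched without blocking using async/await. This is only a sketch: fetchChunk is a stand-in for the real transport (e.g. fetch or $.ajax against read_images.php) and is not part of the original answer.

```javascript
// Non-blocking version of the chunked loop: await each chunk instead of
// forcing a synchronous request. fetchChunk(offset, limit) is a placeholder
// for the real AJAX call returning { processed: <number of records handled> }.
async function processAll(fetchChunk, limit) {
  let offset = 0;
  let done = false;
  while (!done) {
    const response = await fetchChunk(offset, limit);
    offset += response.processed;
    if (response.processed !== limit) {
      done = true; // a short chunk means there is no more data to process
    }
  }
  return offset; // total number of records processed
}
```

The UI-update and error-handling lines from the original example would go inside the loop unchanged; only the control flow differs.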

Do a fixed amount of work per php request

The PHP code also needs modifying to expect and use the GET arguments being passed:

&lt;?php
$stuff = scandir('../images/original');

// Cast the GET arguments so unexpected input can't break array_slice()
$offset = (int)($_GET['offset'] ?? 0);
$limit = (int)($_GET['limit'] ?? 20);

$server_images = array_slice($stuff, $offset, $limit);

foreach ($server_images as $server_image) {
    ...
}
...

$response = array(
    'processed' => count($server_images),
    'message' => 'All is right with the world'
);

header('Content-Type: application/json');
echo json_encode($response);
die;

In this way the amount of work a given PHP request needs to do stays fixed even as the overall amount of data grows (assuming the number of files in the directory doesn't grow to impractical numbers).

PHP Long Running Script timeout

Rather than trying to keep the connection alive, you should let the page load and then, for example, run an AJAX request that waits for the reply.

So on the user's side, JavaScript runs an AJAX request to the PHP URL, and on success you display the result.

$.ajax({
    url: "/thescript.php",
    type: "POST",
    dataType: "json",
    success: function (result) {
        // do display stuff
    }
});

You would probably also want to add a reasonable timeout option.

allow PHP script with long execution time to send updates back to the browser

You are encountering what is known as session locking. PHP will not process another request that calls session_start() for the same session until the first request has finished (or has released the session with session_write_close()).

The immediate fix to your issue is to remove session_start(); from line #1 completely, because I can see that you do not need it there.


Now, for your question about showing a percentage on-screen:

analysis.php (modified)

&lt;?php
ob_start();
ini_set('max_execution_time', 180);

$breaks = [ 1000, 2000, 4000, 6000, 8000, 10000, 20000, 50000, 99999999 ];
$breaks_length = count($breaks);
$p = 0;

foreach ($breaks as $b) {

    $p++;

    // Re-open the session just long enough to record progress, then release
    // the lock so check_analysis_status.php can read it while we keep working.
    session_start();
    $_SESSION['percentage_complete'] = number_format($p / $breaks_length * 100) . "%";
    session_write_close();

    $sql = "query that takes about 20 seconds to run each loop of $b....";

    $query = odbc_exec($conn, $sql);
    while (odbc_fetch_row($query)) {
        $count = odbc_result($query, 'count');
    }

    $w[] = $count;

    /* tried this... doesn't work as it screws up the AJAX handler success which expects JSON
    echo $percentage_complete;
    ob_end_flush();
    */
}

echo json_encode($w);

check_analysis_status.php - get your percentage with this file:

<?php
session_start();
echo (isset($_SESSION['percentage_complete']) ? $_SESSION['percentage_complete'] : '0%');
session_write_close();

Once your AJAX makes a call to analysis.php then just call this piece of JS:

// every half second call check_analysis_status.php and get the percentage
var percentage_checker = setInterval(function(){
$.ajax({
url: 'check_analysis_status.php',
success:function(percentage){
$('#percentage_div').html(percentage);

// Once we've hit 100% we don't need this any more
if(percentage === '100%'){
clearInterval(percentage_checker);
}
}
});
}, 500);

Best way to handle long-running tasks on a web server

This sounds like something you would be better off doing server-side with cron, for example, rather than from a browser - unless, of course, it needs to be run manually at random times by people who have no terminal access.

If it has to be PHP, you can run it with cron or from the terminal and disable the timeout (see: http://php.net/manual/en/function.set-time-limit.php ), and even leave it as a background process if needed with &. This way you wouldn't be limited by Apache's (or whatever server you are using) time limits.
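As an illustration only - the PHP binary path, script location, and log file here are assumptions to adapt to your setup - a crontab entry for such a task might look like:

```
# Run the long task every night at 02:00; the script itself should call
# set_time_limit(0) since CLI PHP is not bound by the webserver's limits.
0 2 * * * /usr/bin/php -q /path/to/longThing.php >> /path/to/longThing.log 2>&1
```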

How to execute a long task in PHP without stopping everything else

Sounds like you are using sessions.

It is not possible to run multiple php scripts in parallel for a single session, because this would lead to unexpected behaviour when using session variables.

More info on this topic:

  • http://konrness.com/php5/how-to-prevent-blocking-php-requests/
  • PHP session_start serializes and blocks all others sharing the same session

Long running php scripts

After you have fixed the session locking, you may be looking for something to prevent the script being killed after the client disconnects (closes the tab or cancels loading):

How do I stop PHP from sending data to client while still running PHP code in server?

Best way to communicate the status of long running PHP-script to other PHP-Script for SSE

What I usually do is create a database table that holds records representing the file uploads, and return references to those records (I like to use a GUID token and not just the auto-incrementing ID) to the client upon the initial file upload requests. The client then polls a service that accepts the identifier as a parameter, verifies that the record belongs to the user, then returns the details. The process that handles the file import updates the record as needed. Once the record indicates completion or failure, the client can redirect to wherever they should go after the import completes or fails. Update the "notes" field with diagnostic details, exceptions, etc. on failure.

CREATE TABLE `file_imports` (
`id` int(11) unsigned NOT NULL AUTO_INCREMENT,
`user_id` int(11) unsigned NOT NULL,
`file_name` varchar(128) NOT NULL,
`status` enum('Pending','Processing','Success','Failed') NOT NULL DEFAULT 'Pending',
`notes` longtext,
`create_date` datetime NOT NULL,
`complete_date` datetime DEFAULT NULL,
`token` varchar(80) NOT NULL,
PRIMARY KEY (`id`),
UNIQUE KEY (`token`),
KEY `file_imports_user_fk` (`user_id`),
CONSTRAINT `file_imports_user_fk` FOREIGN KEY (`user_id`) REFERENCES `users` (`id`) ON DELETE CASCADE
);
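The polling loop described above can be sketched on the client side as follows. This is a sketch under assumptions: getStatus stands in for the real request (e.g. a fetch against a hypothetical status endpoint that takes the GUID token and returns the record's status field), and is not part of the original answer.

```javascript
// Poll the import-status service until the record reports Success or Failed.
// getStatus(token) is a placeholder for the real transport; it is injected
// here so the loop itself stays independent of any particular URL or API.
async function waitForImport(getStatus, token, intervalMs = 1000) {
  for (;;) {
    const record = await getStatus(token); // e.g. { status: "Processing", notes: null }
    if (record.status === "Success" || record.status === "Failed") {
      return record; // caller redirects, or shows record.notes on failure
    }
    await new Promise(resolve => setTimeout(resolve, intervalMs));
  }
}
```

On completion the caller can redirect to the post-import page, or surface the diagnostic details from the notes field if the status is Failed.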

