How to Increase Maximum Execution Time in Laravel

Increase max_execution_time in Laravel

After changing php.ini, you have to restart your web server.

If you restarted it and nothing happened, make sure you are editing the right php.ini; it is possible to have more than one on your system.

To avoid this, you can make the change in the .htaccess file inside the public folder instead.

The change is:

<IfModule mod_php5.c>
php_value max_execution_time 300
</IfModule>

When you make the change in .htaccess, you don't have to restart your web server.

Note: mod_php5.c must not be commented out in your Apache configuration. (If you are running PHP 7 via mod_php, the module is mod_php7.c instead.)

If you insist on making this change in php.ini, you can run <?php phpinfo(); — its output shows the location of your php.ini along with all the information about the PHP version installed on your server.
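If scrolling through the phpinfo() output is tedious, PHP can print the path directly. This is a small standalone sketch, not Laravel-specific:

```php
<?php
// php_ini_loaded_file() returns the path of the php.ini currently in use,
// or false if none was loaded.
$iniPath = php_ini_loaded_file();
echo $iniPath !== false ? "Loaded php.ini: {$iniPath}\n" : "No php.ini loaded\n";

// Keep in mind that the CLI and the web server SAPI often load *different*
// php.ini files, so run this through the same SAPI that serves your app
// (e.g. in a route or a temporary script under public/).
```

From the command line, `php --ini` lists the same information for the CLI SAPI.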

How to solve a timeout error in Laravel 5

The "Maximum execution time of 30 seconds exceeded" error is not related to Laravel but rather to your PHP configuration.

Here is how you can fix it. The setting you will need to change is max_execution_time.

;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;

max_execution_time = 30 ; Maximum execution time of each script, in seconds
max_input_time = 60 ; Maximum amount of time each script may spend parsing request data
memory_limit = 8M ; Maximum amount of memory a script may consume (8MB)

You can change max_execution_time to 300 seconds, like so: max_execution_time = 300
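If you cannot edit php.ini at all, the same limit can be raised at runtime from PHP itself, for example at the top of a long-running controller action. This is a sketch; note that some shared hosts disable these functions:

```php
<?php
// Resets the execution timer and allows the current script 300 more seconds.
set_time_limit(300);

// Equivalent via the ini setting:
ini_set('max_execution_time', '300');
```

Either call only affects the current request, so the rest of your application keeps the default limit.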

You can find the path of your PHP configuration file in the output of the phpinfo function in the Loaded Configuration File section.

How to avoid "Maximum execution time of 60 seconds exceeded" in Laravel without changing max_execution_time in php.ini

Maybe this query returns so many rows that PHP spends most of its time just wrapping a Collection object around them. If you want to see how much time is spent on the query itself, you could run it directly on your PostgreSQL server, in the console (php artisan tinker), or use DB::listen in your code:

public function exportAll(Request $request)
{
// PHP >= 7.4.0
DB::listen(fn($query) => dump($query->sql, $query->bindings, $query->time));
// PHP < 7.4.0
DB::listen(function ($query) { dump($query->sql, $query->bindings, $query->time); });
...
}

If the Collection wrapping is the issue, try using a LazyCollection, available since Laravel 6.0. You use it by calling $data->cursor() instead of $data->get().

A LazyCollection is basically an object you can iterate over and call some Collection methods on. It lets you work with the data without the overhead of building one big Collection for X amount of rows.

  • More information on Lazy Collections
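The difference between get() and cursor() comes down to eager versus lazy iteration. Stripped of Eloquent, the underlying mechanism is a plain PHP generator, which is what a LazyCollection wraps; the row shape below is made up for illustration:

```php
<?php
// Eager: the whole result set is materialized before you see the first row.
function eagerRows(int $n): array {
    $rows = [];
    for ($i = 0; $i < $n; $i++) {
        $rows[] = ['id' => $i];
    }
    return $rows; // all $n rows held in memory at once
}

// Lazy: rows are produced one at a time as the caller iterates,
// so memory use stays flat no matter how large $n is.
function lazyRows(int $n): \Generator {
    for ($i = 0; $i < $n; $i++) {
        yield ['id' => $i];
    }
}

$count = 0;
foreach (lazyRows(100000) as $row) {
    $count++; // only one row exists in memory per iteration
}
```

cursor() builds its LazyCollection from a closure like lazyRows, so the database rows are hydrated one model at a time instead of all at once.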

I'll repost your exportAll function with some changes I think will positively impact performance.

public function exportAll(Request $request)
{
$data = AssetRepository::query(); //From AssetRepository Function

$headers = array(
'Content-Type' => 'text/csv',
'Cache-Control' => 'must-revalidate, post-check=0, pre-check=0',
'Content-Disposition' => 'attachment; filename=export.csv',
'Expires' => '0',
'Pragma' => 'public',
);

$response = new StreamedResponse(function () use ($data) {
$handle = fopen('php://output', 'w');
/**
* Use a LazyCollection instead
* $getData = $data->get();
*/
$getData = $data->cursor();
$remark = Remark::all(['id','label','type']);
$remarkAsset = RemarkAsset::all(['asset_id','value','remark_id']);
/**
* Since we are using a LazyCollection,
* we can't treat $getData as an array directly.
*
* $getHeader = array_keys((array)$getData[0]);
*/
$getHeader = array_keys((array)$getData->get(0));
$newArray = array();
/**
* This can be achieved with array_combine
*
* $setHeader = array();
*
* foreach ($getHeader as $header) {
* $setHeader[$header] = $header;
* }
*/
$setHeader = array_combine($getHeader, $getHeader);
/**
* $remarkHeader is unnecessary. You can just call $remark->toArray() instead.
* Also, what you're trying to do with the following foreach can be done with
* a combination of array_merge and array_combine
*
* $remarkHeader = []; //result
*
* foreach ($remark as $headerRemark) {
* $remarkHeader[] = array(
* 'id' => $headerRemark['id'],
* 'label' => $headerRemark['label'],
* 'type' => $headerRemark['type']
* );
*
* $setHeader[$headerRemark['type']] = $headerRemark['type'];
* }
*/
$setHeader = array_merge(
$setHeader,
array_combine(
$remark->pluck('type')->toArray(),
$remark->pluck('type')->toArray()
)
);
/**
* Again, $remarkAssets is unnecessary. All you're doing with this loop
* is the same as calling $remarkAsset->toArray()
*
* $remarkAssets = [];
* foreach ($remarkAsset as $assetRemark) {
* $remarkAssets[] = (array)array(
* 'asset_id' => $assetRemark['asset_id'],
* 'value' => $assetRemark['value'],
* 'remark_id' => $assetRemark['remark_id']
* );
* }
*/
array_push($newArray, (object)$setHeader);
// $coountData = count($getData) / 4;
/**
* $getData is already a Collection. Here, you're telling PHP to rebuild it
* for no reason. For large collections, this adds a lot of overhead.
* You can already call the chunk method on $getData anyways.
* You could do $chunk = $getData->chunk(500) for example.
* It's not even necessary to make a new variable for it since you won't use
* $chunk again after this.
*
* $chunk = collect($getData);
* $chunk->chunk(500);
*
* Also, according to the docs you're not using chunk properly.
* https://laravel.com/docs/6.x/collections#method-chunk
* You're supposed to loop twice because the chunk method doesn't alter the collection.
* If you run
* $chunk->chunk(500)
* foreach($chunk as $data) { ... }
* You're still looping over the entire Collection.
* Since your code is not made to work with chunks, I'll leave it like that
*
* foreach ($chunk as $data) {
*/
foreach ($getData as $data) {
/**
* This seems to return an array of the keys of $remarkAssets
* where 'asset_id' is equal to $data->id.
* You can achieve this through Collection methods on $remarkAsset instead.
*
* $theKey = array_keys(
* array_combine(
* array_keys($remarkAssets),
* array_column($remarkAssets, 'asset_id')
* ),
* $data->id
* );
*
* Since there is no real need to return an array, I'll leave $theKey as a collection.
*/
$theKey = $remarkAsset->where('asset_id', $data->id)->keys();

/**
* Since $remarkHeader doesn't exist in this context, we use $remark instead
*
* foreach ($remarkHeader as $head) {
*
* Since $theKey is a collection, the count is obtained
* through the count() Collection method. Also, since you don't
* ever use $countKey again, you could inline it instead.
*
* $countKey = count($theKey);
*
* if ($countKey > 0) {
*/
foreach ($remark as $head) {
if ($theKey->count() > 0) {
$valueRemark = '';

foreach ($theKey as $key) {
/**
* Since $theKey holds keys into $remarkAsset (a Collection) and $head
* is an object, the following if statement needs to be rewritten
*
* if ($remarkAssets[$key]['remark_id'] == $head['id']) {
* $valueRemark = $remarkAssets[$key]['value'];
* }
*/
if ($remarkAsset->get($key)->remark_id == $head->id) {
$valueRemark = $remarkAsset->get($key)->value;
}
}

/**
* $data being a stdClass, you can just set the property instead of
* going through the trouble of casting it as an array, setting a value
* and then re-casting it as an object.
*
* $data = (array)$data;
* $data[$head['type']] = $valueRemark;
* $data = (object)$data;
* } else {
* $data = (array)$data;
* $data[$head['type']] = '';
* $data = (object)$data;
*/
$data->{$head['type']} = $valueRemark;
} else {
$data->{$head['type']} = '';
}
}
array_push($newArray, $data);
}

$chunkArray = collect($newArray);
/**
* As explained earlier, your use of chunk() doesn't do anything.
* We can then safely remove this line.
*
* $chunkArray->chunk(500);
*/

foreach ($chunkArray as $datas) {
if (is_object($datas))
$datas = (array)$datas;
fputcsv($handle, $datas);
}

fclose($handle);
}, 200, $headers);

return $response->send();
}

Without all the comments

public function exportAll(Request $request)
{
$data = AssetRepository::query(); //From AssetRepository Function

$headers = array(
'Content-Type' => 'text/csv',
'Cache-Control' => 'must-revalidate, post-check=0, pre-check=0',
'Content-Disposition' => 'attachment; filename=export.csv',
'Expires' => '0',
'Pragma' => 'public',
);

$response = new StreamedResponse(function () use ($data) {
$handle = fopen('php://output', 'w');
$getData = $data->cursor();
$remark = Remark::all(['id','label','type']);
$remarkAsset = RemarkAsset::all(['asset_id','value','remark_id']);
$getHeader = array_keys((array)$getData->get(0));
$newArray = array();
$setHeader = array_combine($getHeader, $getHeader);
$setHeader = array_merge(
$setHeader,
array_combine(
$remark->pluck('type')->toArray(),
$remark->pluck('type')->toArray()
)
);
array_push($newArray, (object)$setHeader);

foreach ($getData as $data) {
$theKey = $remarkAsset->where('asset_id', $data->id)->keys();

foreach ($remark as $head) {
if ($theKey->count() > 0) {
$valueRemark = '';

foreach ($theKey as $key) {
if ($remarkAsset->get($key)->remark_id == $head->id) {
$valueRemark = $remarkAsset->get($key)->value;
}
}
$data->{$head['type']} = $valueRemark;
} else {
$data->{$head['type']} = '';
}
}
array_push($newArray, $data);
}

$chunkArray = collect($newArray);

foreach ($chunkArray as $datas) {
if (is_object($datas))
$datas = (array)$datas;
fputcsv($handle, $datas);
}

fclose($handle);
}, 200, $headers);

return $response->send();
}

You could also use Lazy Collections for the Remark and RemarkAsset models, like so:

$remark = Remark::select('id','label','type')->cursor();
$remarkAsset = RemarkAsset::select('asset_id','value','remark_id')->cursor();

Laravel Maximum execution time of 60 seconds exceeded

For some reason, the error disappeared when I closed my SQLite DB Browser desktop program (most likely the browser was holding a lock on the database file, so Laravel's queries blocked until they timed out). This is still strange, though, since I had had the app open since the beginning of development.

Maximum execution time of 30 seconds exceeded Laravel 4 error

The problem was actually with the Wi-Fi network I was using. I disconnected from it and connected to another one, and everything worked just fine. I had never before seen a Wi-Fi network prevent localhost from sending an email. Thanks for all the help!

Laravel 5.3: What is causing 'maximum execution time of 30 seconds exceeded'

It's not a specific operation running out of time. It's... everything, combined, from start to finish.

max_execution_time integer

This sets the maximum time in seconds a script is allowed to run before it is terminated by the parser. This helps prevent poorly written scripts from tying up the server. The default setting is 30.

http://php.net/manual/en/info.configuration.php#ini.max-execution-time

The idea, here, is that for a web service, generally speaking, only a certain amount of time from request to response is reasonable. Obviously, if it takes 30 seconds (an arbitrary number for "reasonableness") to return a response to a web browser or from an API, something probably isn't working as intended. A lot of requests tying up server resources would result in a server becoming unresponsive to any subsequent requests, taking the entire site down.

The max_execution_time parameter is a protective control to mitigate the degradation of a site when a script -- for example -- gets stuck in an endless loop or otherwise runs for an unreasonable amount of time. The script execution is terminated, freeing resources that were being consumed, usually in an unproductive way.

Is the time aggregate of all requests by a method?

It's the total runtime for everything in the script, not one specific operation.

Does memory overload cause this?

Not typically, except perhaps when the system is constrained for memory and uses a swap file, since swap thrashing can consume a great deal of time.

Will it help by chunking the data and handling it through multiple request?

In this case, yes, it may make sense to work with smaller batches, which (generally speaking) should reduce the runtime per request. Everything is a tradeoff, though: larger batches may or may not be more efficient in terms of processing time per unit of work, which is workload-specific and rarely linear.
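In Laravel, the usual way to process smaller batches is Eloquent's chunkById, which pages through the table by primary key so each iteration only holds one batch in memory. A sketch, where Asset is a hypothetical model name:

```php
<?php
use App\Models\Asset; // hypothetical model

// Fetches and processes 500 rows at a time instead of loading everything;
// chunkById pages by primary key, which stays correct even if rows are
// modified while chunking.
Asset::query()->chunkById(500, function ($assets) {
    foreach ($assets as $asset) {
        // process one row
    }
});
```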

How to dynamically set max_execution_time based on file size in PHP

It's not recommended to change the maximum execution time during execution. Instead, you can dispatch the work to a queued job, which is the right place for time-consuming tasks such as downloads.

Then run the queue worker command with --timeout=0:

php artisan queue:listen --timeout=0
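A minimal sketch of that queued-job approach; the class and property names here are hypothetical, not taken from the question:

```php
<?php

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;

class DownloadFile implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable;

    protected $url;

    public function __construct($url)
    {
        $this->url = $url;
    }

    public function handle()
    {
        // ...perform the long-running download here; the web request
        // that dispatched this job has already returned to the user.
    }
}

// Dispatching from a controller:
// DownloadFile::dispatch($url);
```

The HTTP request only pays the cost of pushing the job onto the queue, so max_execution_time never comes into play for the download itself.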

