How to Solve a Timeout Error in Laravel 5

The "Maximum execution time of 30 seconds exceeded" error is not related to Laravel but rather to your PHP configuration.

Here is how you can fix it. The setting you need to change is max_execution_time in your php.ini:

;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;

max_execution_time = 30 ; Maximum execution time of each script, in seconds
max_input_time = 60 ; Maximum amount of time each script may spend parsing request data
memory_limit = 8M ; Maximum amount of memory a script may consume (8MB)

You can change max_execution_time to 300 seconds, like so: max_execution_time = 300. Remember to restart your web server (or PHP-FPM) afterwards so the new value takes effect.

You can find the path of your PHP configuration file in the output of the phpinfo() function, under the Loaded Configuration File section.
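
If you are unsure which php.ini your web server actually loads (the CLI often uses a different one), a quick throwaway check works too. A minimal sketch, not something to leave in production:

<?php

// Print the php.ini loaded by this SAPI, plus the current limit.
echo php_ini_loaded_file(), PHP_EOL;         // e.g. /etc/php/apache2/php.ini
echo ini_get('max_execution_time'), PHP_EOL; // current limit, in seconds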

Laravel 5.6 - Timeout error after framework update

Thanks to @migser from the official Slack channel, I found the cause.

No update of any sort had ruined my stack; instead, a bad use of the $touches array in both models of a hasMany relationship was freezing the affected tasks.

By removing the wrong $touches declaration (the one in the model that declares the relation), all CRUD tasks work fine again.
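
As a hypothetical sketch of the problem (the Post/Comment names are invented here, not taken from the original setup): if both sides of a relationship list each other in $touches, every save touches the other model, which touches back, and write operations grind to a halt.

<?php

use Illuminate\Database\Eloquent\Model;

class Post extends Model
{
    // Bad: the parent touches all of its children on every save...
    protected $touches = ['comments'];

    public function comments()
    {
        return $this->hasMany(Comment::class);
    }
}

class Comment extends Model
{
    // ...while each child also touches the parent, creating a cycle.
    // Keeping $touches only on this belongsTo side avoids the freeze.
    protected $touches = ['post'];

    public function post()
    {
        return $this->belongsTo(Post::class);
    }
}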

Thanks everyone for the tips and help.

Laravel Timeout Issue

After a lot of debugging, I figured it out: Apache was timing out. Apparently, when Apache times out, it returns a 500 response code. Apparently (again), when a browser gets a 500 error code in response to a POST request, it resends it as a GET request. I wrote it up in more detail here: http://blog.voltampmedia.com/2014/09/02/php-apache-timeouts-post-requests/

To be clear, it's not a Laravel issue. Do note that the Whoops library does capture the timeout error.

Laravel 5.3: What is causing 'maximum execution time of 30 seconds exceeded'

It's not a specific operation running out of time. It's... everything, combined, from start to finish.

max_execution_time integer

This sets the maximum time in seconds a script is allowed to run before it is terminated by the parser. This helps prevent poorly written scripts from tying up the server. The default setting is 30.

http://php.net/manual/en/info.configuration.php#ini.max-execution-time

The idea, here, is that for a web service, generally speaking, only a certain amount of time from request to response is reasonable. Obviously, if it takes 30 seconds (an arbitrary number for "reasonableness") to return a response to a web browser or from an API, something probably isn't working as intended. A lot of requests tying up server resources would result in a server becoming unresponsive to any subsequent requests, taking the entire site down.

The max_execution_time parameter is a protective control to mitigate the degradation of a site when a script -- for example -- gets stuck in an endless loop or otherwise runs for an unreasonable amount of time. The script execution is terminated, freeing resources that were being consumed, usually in an unproductive way.
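
If one particular request is legitimately long-running, a common alternative to raising the limit globally is to raise it for that one script only. A minimal sketch:

<?php

// Raise the limit for this script alone; set_time_limit() also restarts
// the timeout counter from zero at the point it is called.
set_time_limit(300); // allow up to 300 seconds instead of the default 30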

Is the time aggregate of all requests by a method?

It's the total runtime time for everything in the script -- not one specific operation.

Does memory overload cause this?

Not typically, except perhaps when the system is constrained for memory and uses a swap file, since swap thrashing can consume a great deal of time.

Will it help by chunking the data and handling it through multiple requests?

In this case, yes, it may make sense to work with smaller batches, which (generally speaking) should reduce the runtime. Everything is a tradeoff, as larger batches may or may not be more efficient, in terms of processing time per unit of work, which is workload-specific and rarely linear.
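
For example, with Eloquent you can process rows in fixed-size batches. A minimal sketch, assuming a hypothetical User model and some per-row work:

<?php

use App\User; // hypothetical model; adjust to your application's namespace

// Process the table 500 rows at a time, keeping the runtime and memory
// footprint of each batch bounded.
User::query()->chunk(500, function ($users) {
    foreach ($users as $user) {
        // ... per-user work goes here ...
    }
});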

Laravel queue process timeout error

Adding --timeout=0 worked for my setup.

UPDATE:
The entire command would therefore be php artisan queue:listen --timeout=0.

Hope this helps.

Laravel: How to configure Eloquent to throw an exception on database timeout?

The reason you are getting a 504 Gateway Time-out error is that when you perform a query against a database that does not exist, it takes a very long time (around 2 minutes for me).

I suspect the max execution time set by your web server or PHP config is lower than that, which is what generates the 504 Gateway Time-out error.

There are two ways to fix this:

Increase your max-execution time in your server & PHP config

Increase the max execution time in your web server by adjusting its config file:

  • Nginx: https://ubiq.co/tech-blog/increase-request-timeout-nginx/
  • Apache: https://ubiq.co/tech-blog/increase-request-timeout-apache/

Also increase max_execution_time in your php.ini file (make sure to change the one used by the web server, not just the CLI one), or add this at the start of index.php:

ini_set('max_execution_time', 1200); // 1200 seconds

Reduce the PDO timeout value

Use the following options:

'options' => [
    PDO::ATTR_TIMEOUT => 10, // timeout in seconds
],

Note that in my tests the actual timeout was often larger than the specified value, for reasons I could not determine: a timeout of 10 seconds actually timed out at 40 seconds, and 20 seconds timed out at 80 seconds. Your experience may vary.
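
For context, the option belongs in the connection block of config/database.php. A minimal sketch, assuming the default mysql connection:

'mysql' => [
    'driver'  => 'mysql',
    // ... host, database, username, password, etc. ...
    'options' => [
        PDO::ATTR_TIMEOUT => 10, // fail fast instead of hanging into a 504
    ],
],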


