60

I have the following set up:

In routes I have:

Route::get('articles', 'ArticlesController@index');

The index method in the controller is simply:

public function index()
{
   $articles = Article::all();
   return View('articles.index', compact('articles'));
}

and in the view:

@extends('../app')
@section('content')
<h1>Articles</h1>
<p>
    @foreach($articles as $article)
    <article>
        <h2><a href="{{action('ArticlesController@show', [$article->id])}}">{{$article->title}}</a></h2>
        <p>{{ $article->body }}</p>
    </article>
    @endforeach
</p>
@stop

I attempted to replace the:

$articles = Article::all();

with

$article = Article::latest()->get();

such that I can actually show the articles latest first. I got the error:

FatalErrorException in Str.php line 322:
Maximum execution time of 30 seconds exceeded

and the call stack is:

in Str.php line 322
at FatalErrorException->__construct() in HandleExceptions.php line 131
at HandleExceptions->fatalExceptionFromError() in HandleExceptions.php line 116
at HandleExceptions->handleShutdown() in HandleExceptions.php line 0
at Str::snake() in helpers.php line 561
at snake_case() in ControllerInspector.php line 105
at ControllerInspector->getVerb() in ControllerInspector.php line 78
at ControllerInspector->getMethodData() in ControllerInspector.php line 39
at ControllerInspector->getRoutable() in Router.php line 251
at Router->controller() in Router.php line 226
at Router->controllers() in Facade.php line 210
at Facade::__callStatic() in routes.php line 21
at Route::controllers() in routes.php line 21
in RouteServiceProvider.php line 40

... etc

I have restored the controller method to what it was, but the error persists.

Can you please tell me how I can solve this problem?

Muhammad
Geordie Gadgie
  • what is `id` here? An index of the article array? – Kishor May 17 '15 at 11:25
  • What happens when you try `$article = Article::orderBy('created_at', 'desc')->get();` instead of `$article = Article::latest()->get();` or whatever time stamp field you have? – haakym May 17 '15 at 14:24
  • it turned out that my setup was faulty; XAMPP with Laravel needs to have Xdebug disabled for some reason – Geordie Gadgie May 18 '15 at 03:02

14 Answers

91

The "Maximum execution time of 30 seconds exceeded" error is not related to Laravel but rather to your PHP configuration.

Here is how you can fix it. The setting you need to change in php.ini is `max_execution_time`:

;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;

max_execution_time = 30     ; Maximum execution time of each script, in seconds
max_input_time = 60 ; Maximum amount of time each script may spend parsing request data
memory_limit = 8M      ; Maximum amount of memory a script may consume (8MB)

For example, you can raise `max_execution_time` to 300 seconds with `max_execution_time = 300`. Remember to restart your web server after editing php.ini so the change takes effect.

You can find the path of your PHP configuration file in the Loaded Configuration File section of the `phpinfo()` output.
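A quick way to see what is currently in effect (a sketch, assuming the PHP CLI is available; the values shown depend entirely on your configuration):

```php
<?php
// Inspect the active configuration at runtime.
echo php_ini_loaded_file(), "\n";         // path of the loaded php.ini (empty if none is loaded)
echo ini_get('max_execution_time'), "\n"; // e.g. "30" under a default web config, "0" on the CLI
```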

Noman Ur Rehman
  • Laravel routes on localhost are very very slow. `Route::get('/', function() {die('here')});` shows 'here' string after 5 seconds. PHP7 + Apache. Windows 7 and Ubuntu 14.04 with the same behavior. – Maykonn May 11 '16 at 01:32
  • Updating max_execution_time is not a good solution. Try to optimize the query instead. – Geo Tom May 07 '17 at 07:01
  • instead of updating the configuration file use `ini_set('max_execution_time', 300);` in required pages. – Geo Tom May 07 '17 at 07:04
  • @GeoTom The second argument to ini_set() needs to be a string instead of an integer. `ini_set('max_execution_time', '300');` http://php.net/manual/en/function.ini-set.php – Travis Hohl Aug 29 '17 at 12:34
  • when updating `max_execution_time` to 300, the error now becomes "Maximum execution time of 300 seconds exceeded" – Navneet Krishna Oct 25 '17 at 04:21
  • But why in only certain modules? – A4family Jun 25 '21 at 06:07
60

It's a pure PHP setting. The alternative is to increase the execution time limit only for specific PHP scripts, by inserting the following at the top of that PHP file:

ini_set('max_execution_time', 180); //3 minutes
Grigoreas P.
  • where should I add this in Laravel? When inserting this in the controller I get the error `Namespace declaration statement has to be the very first statement or after any declare call in the script` – davejal Mar 24 '17 at 13:47
33

In Laravel:

Add a `set_time_limit(0)` call above the query:

set_time_limit(0);

$users = App\User::all();

This helps with large queries, but you should still optimise the query itself.
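As a sketch of a more cautious variant (the 120-second value is an arbitrary choice, not from the original answer), `set_time_limit()` restarts the timer from the point where it is called, and a finite value keeps a runaway query from running forever:

```php
<?php
// Prefer a bounded limit over set_time_limit(0), which removes the limit entirely.
set_time_limit(120);                      // the timer restarts: 120 s from this call
echo ini_get('max_execution_time'), "\n"; // set_time_limit() updates this setting: prints "120"
```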

Umar Tariq
  • I think you should never do that in production, because it will seriously impact your performance. This code will **execute your code without a time limit**. I suggest you set a bounded timer instead, e.g. 100 seconds. http://php.net/manual/en/function.set-time-limit.php – ibnɘꟻ Mar 01 '19 at 17:09
  • As mentioned, setting it to 0 is questionable at best. Even a limit of 100, as the above commenter mentioned, or even 300, is better than `0` here; something is better than nothing. – Cfomodz Mar 10 '22 at 19:42
14

Sometimes you just need to optimize your code or query; raising max_execution_time is not a solution.

A web page should not take more than 30 seconds to load; if a task needs more time, it should be moved to a queue.

Alireza
  • sometimes you have no choice – ShaneMit Sep 21 '17 at 16:57
  • @ShaneMit you could run your heavy code in the background – Alireza Sep 23 '17 at 08:07
  • @Alireza again, sometimes you have no choice. I have been tossed into legacy systems where I needed a script to run for more than 30 seconds; it happens, and sometimes you simply have no choice. – ShaneMit Sep 28 '17 at 22:24
  • @ShaneMit Right – Alireza Oct 03 '17 at 08:03
  • If your code is heavily reliant on 3rd-party APIs and then does a lot of processing, you sometimes have no choice, as 30 seconds will just not cut it! Of course optimisation helps, and of course we can try to separate out tasks, but it simply isn't always possible when you come into a project as a contractor and you have very tight schedules. "This needs rewriting," I say; "No time," comes the reply! :-) – Andy Jan 27 '18 at 13:26
  • That is a case too; I have code that takes 20 minutes to finish, but in most cases we should not change max_execution_time – Alireza Jan 28 '18 at 13:12
  • Whatever gets the job done fast... In the world we live in, a Minimum Viable Product is more important than perfecting something that might not yet suit market needs. – CodeGuru Feb 22 '19 at 10:09
  • @CodeGuru yes, that can be the case too – Alireza Feb 23 '19 at 09:50
10

Set the time limit in the `__construct` method, or in the specific controller action where you need a larger limit:

public function __construct()
{
    set_time_limit(8000000);
}
Mahendra Pratap
2

Try:

ini_set('max_execution_time', $time);
$articles = Article::all();

where $time is in seconds; set it to 0 for no limit. Make sure to set it back to 60 after your function finishes.
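The save-and-restore advice can be sketched like this (plain PHP; 300 seconds is an illustrative value):

```php
<?php
// Raise the limit only around the long-running work, then put the old value back.
$previous = ini_get('max_execution_time'); // remember the current limit
ini_set('max_execution_time', '300');      // raise to 5 minutes for this script
// ... long-running work here ...
ini_set('max_execution_time', $previous);  // restore the original limit
```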

2

Add this above the query:

ini_set("memory_limit", "10056M");

sanjay
1

The execution limit is not related to Laravel. In the php.ini file, set max_execution_time = 360 (the value may vary depending on your needs). If you want to increase the execution time of one specific page, write ini_set('max_execution_time', 360) at the top of that page;

otherwise, in .htaccess: php_value max_execution_time 360

0

If you are using PHP 7, I would suggest changing the default value in public/.htaccess:

<IfModule php7_module>
   ...
   php_value max_execution_time 300
   ...
</IfModule>
Kwaye Kant
0

If you are hosting your Laravel app on Heroku, you can create a custom_php.ini file in the root of your project and simply add max_execution_time = 60.

DonKoko
0

I had a similar problem just now; however, it had nothing to do with modifying the php.ini file. It came from a for loop. If you have nested for loops, make sure you are using the iterators properly. In my case, I was advancing the outer iterator from inside my inner loop.

Norseback
-1

By default the max execution time is 30 seconds. I faced this issue when loading a large page with a huge amount of data coming from the database. Other factors can cause it too, such as big images, lots of JavaScript files, and big icons. I fixed it by adding this line to the entry point of the Laravel application, public/index.php:

ini_set('max_execution_time', '60'); 

I did this for my entire Laravel application, but in your case you can do it only on the specific pages causing the problem: just write that line at the start of the PHP script for the particular page.

Danish Mehmood
-4

Options -MultiViews -Indexes

RewriteEngine On

# Handle Authorization Header
RewriteCond %{HTTP:Authorization} .
RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]

# Redirect Trailing Slashes If Not A Folder...
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_URI} (.+)/$
RewriteRule ^ %1 [L,R=301]

# Handle Front Controller...
RewriteCond %{REQUEST_FILENAME} !-d
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^ index.php [L]

# raise the limits for file uploads
php_value memory_limit 400M
php_value post_max_size 400M
php_value upload_max_filesize 400M
# this is the line you need to add:
php_value max_execution_time 300

-23

You just need to press CTRL + F5 (a hard refresh of the page). It will work after that.

Bhargav Rao
Sandro Cagara