7

When using the php -l myFile.php command (PHP 5.5.30), if the file has a syntax error then I get the proper warnings and stack trace, etc.

However, if the file has no syntax errors I get the message

No syntax errors detected in myFile.php

Is there a way to have the command have no output when the syntax is valid? I only care if a file has invalid syntax - I don't need a message saying it's valid.

Matthew Herbst
  • I normally set errors to true in php.ini for development, or directly on that PHP file if in production, and just run `php -f PHPFILENAME.php` – Binary101010 Dec 11 '15 at 00:27
  • @Binary101010 That's not the same by any means. `-l` just checks for syntax errors. Why would you run the script, potentially making changes to files or databases, or throwing errors if an included/required file is not available, just to check whether a file's syntax is correct? – Jonathan Kuhn Dec 11 '15 at 00:30

4 Answers

9

The "No syntax errors..." message is sent to stdout, while the syntax errors are sent to stderr. You can redirect stdout somewhere like /dev/null if you don't want the success message.

php -l file.php 1> /dev/null

That will output the errors if there are any, or nothing if there are none. You do lose the "Errors parsing..." summary line, but you still get the actual errors when there is a problem.
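
For instance, a small sketch of applying the same redirection across a directory (the `src/` path is only an assumption; adjust it to your tree). Each file's success message is discarded, while any parse errors should still come through on stderr on a typical CLI setup:

# Lint every .php file under src/, hiding only the per-file "No syntax errors" messages.
find src/ -name '*.php' -print0 | while IFS= read -r -d '' file; do
    php -l "$file" 1> /dev/null
done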

Jonathan Kuhn
  • I like it. Kind of hacky but it gets the job done. I'm going to wait a bit to see if anyone has any idea how to stop output in the first place, but this is good, thanks! – Matthew Herbst Dec 11 '15 at 01:48
  • I have a bash script that I use that checks a file's status like this before creating a patch from our version control system. Essentially what I do is what Sammitch is suggesting: check `if ! php -l file.php &> /dev/null; then` and, if that is true, call `php -l file.php` again so it outputs whatever errors there were. Also, this isn't "hacky". It is pretty standard for bash scripts to route output around. – Jonathan Kuhn Dec 11 '15 at 06:57
  • Is calling `php -l` twice better practice than just sending the output of the first call to a temp file and then dumping that file if the negated call returned true? – Matthew Herbst Dec 11 '15 at 17:10
  • I don't know if it makes much of a difference. At least it doesn't in my case. The script I have that runs those lines is for making patches from a version control system. So if I'm being a good programmer and testing as I go along, I'll never have to run it twice. It's only there if I make a mistake after testing that I don't catch before checking in a change. – Jonathan Kuhn Dec 11 '15 at 17:57
  • Interesting. I run the linter as part of my build process - the tests won't even be executed if the lint fails (and that failure will kill a pr-deploy also). – Matthew Herbst Dec 11 '15 at 19:24
  • Is this answer still valid? Does not seem to work with my php 5.6 install. – Dave Kok Jun 07 '16 at 12:36
  • @DaveKok Should be. I doubt they changed the output/channels and on linux you can always route output around with `anyBashCommand > someFile.txt`. How is it not working for you? – Jonathan Kuhn Jun 07 '16 at 17:16
  • @JonathanKuhn Well, error messages are not sent to stderr but to stdout. – Dave Kok Jun 08 '16 at 10:17
  • @DaveKok The answer routes stdout to `/dev/null`, not stderr. You can get php to route errors to stderr pretty easily by setting the `display_errors` directive (http://php.net/errorfunc.configuration#ini.display-errors), which can be set to `stderr` with `ini_set()` to redirect errors within php. You can see from my comment above, the way I actually use this is to first redirect both out and err to /dev/null in an if statement, and if the process returned an error code, run it again to output the errors and exit the bash script. (A per-invocation sketch follows these comments.) – Jonathan Kuhn Jun 08 '16 at 17:32
  • @JonathanKuhn I don't think you can use ini_set with php -l but changing the php.ini file does work. Strange behavior though. You would expect stderr to be the default setting. But thanks for pointing it out. – Dave Kok Jun 09 '16 at 08:13
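
Picking up on the `display_errors` thread above, a minimal, hedged sketch using the CLI's `-d` switch, which sets an INI directive for that single invocation only (assuming your install honours `display_errors=stderr` during lint). Errors then land on stderr while the success message on stdout is discarded:

# Per-invocation override: send errors to stderr, hide the stdout success message.
php -d display_errors=stderr -l myFile.php 1> /dev/null
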
3

You can use chronic to suppress all output if the command succeeds (returns 0):

chronic php -l myFile.php

DESCRIPTION

chronic runs a command, and arranges for its standard out and standard error to only be displayed if the command fails (exits nonzero or crashes). If the command succeeds, any extraneous output will be hidden.

On Debian, it's in the moreutils package.
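
If you want to run this over many files in a wrapper script, a rough sketch (the `src/*.php` glob is an assumption): chronic passes each command's exit status through, so you can still fail the overall run when any file has errors:

# Silent for clean files; chronic prints php -l's output only for files that fail.
status=0
for f in src/*.php; do
    chronic php -l "$f" || status=1
done
exit "$status"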

Toby Speight
1

Don't check the output, check the return code.

$ php -l good.php &> /dev/null; echo $?
0

$ php -l bad.php &> /dev/null; echo $?
255

So:

if ! php -l somescript.php &> /dev/null; then
  echo 'OH NOES!'
fi

Or if you're feelin fancy:

if ! foo=$(php -l somescript.php 2>&1); then
  echo "$foo"
fi
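
To fold that last pattern into a build step, one possible wrapper (the script name and file list are made up for illustration) stays silent for clean files, prints the captured output for broken ones, and exits non-zero so the surrounding task fails:

#!/usr/bin/env bash
# lint.sh - hypothetical wrapper around php -l: quiet on success, loud on failure.
status=0
for f in "$@"; do
    if ! out=$(php -l "$f" 2>&1); then
        echo "$out"
        status=1
    fi
done
exit "$status"

Invoked as, say, ./lint.sh src/*.php before the rest of the build.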
Sammitch
  • I'm not checking the output or the return code. I'm running a lint task as part of my gulp build process. The `No syntax errors detected in myFile.php` message is showing up in the npm build output/logs, making them very cluttered. I'd like to prevent them from being there. – Matthew Herbst Dec 11 '15 at 01:47
-1
php -ln script.php >/dev/null || php -ln script.php

This runs the lint silently; only if the first command exits non-zero does the `||` run it again, so errors are printed while clean files produce no output. (`-n` just tells PHP to skip php.ini.)

EDIT:

chronic php -ln script.php
Kit