12

I always try (well, mostly) to comment my code. I have configured my server to strip those comments and extra whitespace before delivery. Would it be better not to have comments in the live system's code (JavaScript/PHP) at all, and thereby avoid the overhead of delivering or interpreting them?

If so how can I have my cake and eat it too?

Ed Heal
  • It sounds like you already have it working. What exactly is the problem? – mellamokb Oct 11 '11 at 18:51
  • I would still comment my code, and use a minifier to minify and remove comments, when delivering it. – Mahesh Velaga Oct 11 '11 at 18:52
  • Related, see http://stackoverflow.com/questions/2731022/do-comments-slow-down-an-interpreted-language - it doesn't apply fully to the languages you mention, as those don't cache the result of bytecode compilation. But the point about comments being nonexistent after the (very fast) lexing stage still stands. Needless to say, even if this overhead were measurable in a microbenchmark, it would be insane to throw away good comments because of it. –  Oct 11 '11 at 18:53

9 Answers

22

For PHP, it makes no difference. Your PHP code isn't sent out to the browser.

For JavaScript, it is recommended that you minify your code. This reduces its size by shortening variable names, removing whitespace, and stripping all comments. There are several online tools for doing this, and the feature is often built into your IDE.

Whatever you do, leave your code commented where you work on it. Don't remove comments from PHP, and don't minify your JS by hand.
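A quick way to convince yourself of this is the sketch below (illustrative, not from the original answer): the JavaScript parser discards comments before execution, so only the transferred source size differs.

```javascript
// Two function bodies with identical logic; one carries comments.
// The parser discards the comments, so both behave the same -- only
// the source *size* (what travels over the wire) differs.
const commentedSrc =
  "/* add two numbers (discarded by the parser) */ return a + b; // also discarded";
const bareSrc = "return a + b;";

const withComments = new Function("a", "b", commentedSrc);
const withoutComments = new Function("a", "b", bareSrc);

console.log(withComments(2, 3));    // 5
console.log(withoutComments(2, 3)); // 5
console.log(commentedSrc.length > bareSrc.length); // true -- only the payload grew
```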

Brad
  • For PHP - surely the parser needs to see the comments each time and skip over them, i.e., read access from disk plus processing to skip them. – Ed Heal Oct 11 '11 at 18:55
  • @EdHeal Do you see comments in large, well-known and used libraries? Yes, you do. It is a negligible cost to skip comments. – Levi Morrison Oct 11 '11 at 18:59
  • @EdHeal, any performance hit would be so small that it is immeasurable. Go ahead and test it if you don't believe me. For whoever downvoted this, please explain yourself. – Brad Oct 11 '11 at 19:07
  • For JavaScript, it is only recommended to minify because the file has to be downloaded by the client -- and the minified version's transport is faster. But for both PHP and JavaScript, it's true that the parser has to read more bytes if you add comments. Which is a tiny overhead, however. – caw Apr 09 '14 at 19:14
  • -1, because the OP asked about performance -- irrespective of browsers or network traffic. There is still *some* cost for interpreting (skipping) comments on the server side. The question is exactly about that: how much is that "some"? This answer ignores that aspect, which is actually the gist of the issue. – Sz. Feb 10 '17 at 13:49
  • @Sz. My answer addresses the performance. It explains that there is no cost. No cost as in no measurable cost. (Try this yourself, it's truly immeasurable.) I explain the minification of JavaScript because it's not clear from the question that the person asking understands *why* they are minifying their JavaScript. It's implied in the question that the reason is parsing speed, which is simply not true here. I'm sorry that my extra explanation to solve the actual problem in the real context of the whole question did not meet your expectations, and that it distracted from my first line. – Brad Feb 11 '17 at 01:59
  • As far as coding goes, small is beautiful. Some spend time commenting even the most obvious lines. You might waste your time trying to minify and remove comments, but you spend even more writing them. I assume some like that feeling of being the teacher wiz4rd explaining his magic to others, but I see really silly/unneeded comments in open-source stuff. When coding PHP, JS or any interpreted language, you should make your code as slim/effective as possible and avoid spending time writing useless comments which, even if just in a really huge-traffic setup, can have some performance impact. – DeZeA Nov 18 '17 at 13:11
  • "For PHP, it makes no difference. Your PHP code isn't sent out to the browse " This argument makes no sense to me. When you run a php script there will be a time to first byte towards the visitor. The more intensive the php script, the longer the TTFB. Therefor on php the time it takes to execute a script does matter since it will affect the TTFB of that page. Now if comments slow down TTFB is something that I personally have not tested, but could theoretically happen. – Magistar Sep 08 '19 at 17:09
  • @Magistar Comments and whitespace are going to make no measurable difference at all in the time it takes to run that script, and that's what the question is asking about. – Brad Sep 08 '19 at 17:17
  • @Brad I can believe that is true, I just haven't seen an argument for it. As far as my limited knowledge goes, the PHP code is converted to machine code, which is then executed, and the result is sent to the client. So simply stating that the client does not receive the PHP code ignores the conversion step. One explanation would be that the conversion itself is not slowed down by comments and whitespace, but that would have to be tested, don't you agree? – Magistar Sep 11 '19 at 15:08
8

Although the general assumption is that having PHP chew through comments causes no measurable difference, it's better to check, right?

(Note: by common sense, we'd expect that the sheer request processing, permission management, process control, dispatching this, delegating that, firing up the PHP runtime environment, managing various caches, fiddling with asset files, overall disk and network I/O etc. etc., oh, and BTW, also executing the code, all very likely add up to far more than any generous amount of comments.)

So I gave it a very unsophisticated go, just to get an instant feel of it.

  1. Setup

Predicting the "comment impact" to be as difficult to detect as neutrinos, I was deliberately after a slightly pathological setup, trying to make the difference measurable, but still not be overly unrealistic.

I created two files. One with no comments, just ~100 bytes, straight to the point, no-comments.php:

<?php
function task() {
    ++$GLOBALS;
    echo "[$GLOBALS] Lorem ipsum dolor sit amet cosectetur...\n";
}

And another, ~60K (staying under 64K just for heap-management-related superstition), comments.php:

<?php
/* ... some 30K comments ... */
// OK, that's something, but how about:
/* ... same 30K comments again ... (Phantomjs changelog, for the curious of you. :) ) */
// Finally, do something:
function task() {
    ++$GLOBALS; // Comments are cheap, so let me tell you how much I enjoyed this instead of properly declaring a counter. :)
    echo "[$GLOBALS] Lorem ipsum with a lot of comments...\n";
}

Note: this would of course very likely test file size impact actually, not purely the comments, but that's always an inherent part of the "comments (non)issue" anyway, and also I wanted just something first. Perhaps even that's already unmeasurable, right?

The general idea was then to loop task() in various ways, just a bit (or none at all) from inside the same PHP process, and a lot from outside of it, via separate executions, to force reparsing, which is the only interesting part of this experiment.

For quickest results I did some shell runs:

#!/bin/bash
for (( i = 0; i < 1000; i++ ))
do
   php comments.php  # <-- and another batch with "no-comments.php"
done

But that turned out to be unreliable, as increasing the loop count caused inexplicable and disproportional changes in execution times. I switched to a PHP runner instead, which ran more smoothly:

#!/usr/bin/php
<?php
$t1 = microtime(true);
for ($i = 0; $i < 1000; ++$i ) {
        system("php comments.php"); // <-- and with "no-comments.php"
}
$t2 = microtime(true);
echo "Time: ", $t2 - $t1

For HTTP runs I then added this index.php:

<?php
$GLOBALS = 0; // innovative use of a dull language feature ;)

$t1 = microtime(true);

require_once (isset($_GET['no']) ? 'no-' : '') . 'comments.php';

// Played a bit with looping here, but ended up leaving it out.
// for ($i = 0; $i < 3; ++$i) {
//      task();
//      echo '<br>';
// }

$t2 = microtime(true);
echo "<hr>Time: ",  number_format($t2 - $t1, 10);

Note: at first, unfortunately, I left PHP's Zend Opcache enabled and wasted a lot of time trying to make sense of the results... ;-o Then I disabled the cache, of course, and repeated the web tests (only).

The host is just vanilla Debian, Apache2 with some PHP5 (I guess it's FPM -- I didn't even bother checking, as that should be orthogonal to the subject of the test (please correct me if this is not true). It may actually even help expose the difference, by reducing the irrelevant PHP startup overhead that masks the tiny comment-parsing time.)

  2. Results - shell:

Running PHP-cli was surprisingly slow, so I quickly got bored after only a dozen runs of 1000 iterations each for both variants. (Results in seconds.)

COMMENTS:

44.2015209198
39.710990905762
42.374881982803
36.29861998558
44.764121055603
38.85772395134
42.627450942993
38.342661142349
48.539611816406
39.784120082855
50.34646987915
47.782819032669
36.974604845047
45.692447900772

AVERAGE: 42.592717

NO COMMENTS:

45.617978811264
43.397685050964
46.341667175293
44.246716976166
40.348230838776
43.048954963684
38.57627081871
50.429704189301
41.811543226242
35.755078077316
53.086957931519
31.751699924469
48.388355970383
49.540207862854

AVERAGE: 43.738647

As you can see, it's all rubbish... But if we ignore the environmental fluctuations, the conclusion is: use more comments, they'll make your script faster! :)

  3. Results - HTTP, Zend Opcache enabled:

(Some noise was cut from the ab outputs.)

COMMENTS:

ab -qd -n 10000 'http://.../comments/?yes'

Server Software:        Apache/2.4.10
Concurrency Level:      1
Time taken for tests:   3.158 seconds
Complete requests:      10000
Failed requests:        0
Non-2xx responses:      10000
Total transferred:      7120000 bytes
HTML transferred:       4620000 bytes
Requests per second:    3166.12 [#/sec] (mean)
Time per request:       0.316 [ms] (mean)
Transfer rate:          2201.45 [Kbytes/sec] received

NO COMMENTS:

ab -qd -n 10000 'http://.../comments/?no'

Server Software:        Apache/2.4.10
Concurrency Level:      1
Time taken for tests:   3.367 seconds
Complete requests:      10000
Failed requests:        0
Non-2xx responses:      10000
Total transferred:      7120000 bytes
HTML transferred:       4620000 bytes
Requests per second:    2969.95 [#/sec] (mean)
Time per request:       0.337 [ms] (mean)
Transfer rate:          2065.04 [Kbytes/sec] received

Wow! :-o Just like the shell runs! :) OK, not believing my eyes, I repeated it a few more times, until it made sense... :) See? Here:

Benchmarking ...<"NO COMMENTS">... (be patient).....done

Time taken for tests:   2.912 seconds
Total transferred:      7120000 bytes
HTML transferred:       4620000 bytes
Requests per second:    3433.87 [#/sec] (mean)
Time per request:       0.291 [ms] (mean)
Transfer rate:          2387.61 [Kbytes/sec] received

(BTW, don't ask me, why the non-2xx responses. They were 200 OK via the web.)

Then, with ten times more iterations:

COMMENTS:

Time taken for tests:   32.499 seconds
Requests per second:    3077.04 [#/sec] (mean)
Time per request:       0.325 [ms] (mean)
Transfer rate:          2139.51 [Kbytes/sec] received

NO COMMENTS:

Time taken for tests:   28.257 seconds
Requests per second:    3538.92 [#/sec] (mean)
Time per request:       0.283 [ms] (mean)
Transfer rate:          2460.66 [Kbytes/sec] received

Phew, perfect! Comments are evil! ;)

Well, I still did a couple more, and I can only show you this no-comment result strictly off the record:

Time taken for tests:   37.399 seconds
Requests per second:    2673.84 [#/sec] (mean)
Time per request:       0.374 [ms] (mean)
Transfer rate:          1859.15 [Kbytes/sec] received

  4. Results - HTTP, Zend Opcache DISABLED:

OK, after realizing that I left the cache on, I commented out the extension from the PHP-FPM config (so, indeed, that's what runs here), restarted the services, checked phpinfo(), and gathered the new results:

COMMENTS:

Time taken for tests:   34.756 seconds
Requests per second:    2877.23 [#/sec] (mean)
Time per request:       0.348 [ms] (mean)
Transfer rate:          2000.58 [Kbytes/sec] received

Once again:

Time taken for tests:   31.170 seconds
Requests per second:    3208.24 [#/sec] (mean)
Time per request:       0.312 [ms] (mean)
Transfer rate:          2230.73 [Kbytes/sec] received

NO COMMENTS:

Time taken for tests:   30.060 seconds
Requests per second:    3326.70 [#/sec] (mean)
Time per request:       0.301 [ms] (mean)
Transfer rate:          2313.10 [Kbytes/sec] received

Once again:

Time taken for tests:   32.990 seconds
Requests per second:    3031.23 [#/sec] (mean)
Time per request:       0.330 [ms] (mean)
Transfer rate:          2107.65 [Kbytes/sec] received

Well. As you can see, basically: no freaking difference from the opcache on/off state! Nor between comments on/off (apart from a tiny hint, but having seen the fluctuations...)! :-o

  5. Conclusion

So... Finally, numbers! Well, useless garbage, as a matter of fact, but at least not just religious speculation. It feels a lot better being confused for the sound reason of confusing data than for the lack of it! :)

Now, after I've certainly wasted more than enough time on this, the answer to the age-old question of "how much do comments cost" remains a mystery.

As neutrinos have (incredibly) been detected for years, we may justly start feeling embarrassed. Will someone eventually bring on the breakthrough and finally detect the PHP comment impact, too? Nobody knows...

Sz.
  • Incredibly thorough answer. While the performance cost of comments is of course extremely minimal, they are a step for the interpreter to skip, and it's useful to see *proof* that comments are negligible, rather than assumptions. – Goose Apr 24 '17 at 17:42
  • Thanks, it was kinda fun. :) While good as an approximation, I'd say don't trust the results blindly, they are just too vague. In case I'd happen to live forever, I might repeat it with a) much bigger files (I realized too late that it would expose the difference better), and b) in a more controlled environment. – Sz. Apr 25 '17 at 19:40
  • The variation in your results is too high. Before you can conclude anything, you need a decent standard deviation; right now there is no regular distribution of the kind one would expect. When I have time I can try to run this in a container with limited resources, so it should stabilize. – Magistar Sep 11 '19 at 15:12
  • @Magistar, yes, thank you, indeed, good point. Even though I didn't strive to be properly rigorous, I did start to regret that, wishing I had been better prepared for the task, but common sense (or perhaps just a lack of patience) eventually put an end to the effort... – Sz. Dec 14 '22 at 10:35
2

Yes it has an impact! There is NO doubt about it.

Each time PHP must interpret code that is NOT somehow cached, the I/O operation takes longer if it needs to read more data from disk.

The interpretation itself (if NOT cached one way or another) takes longer too.

The performance penalty is very much depending on the file system and caches in use. It may not be that important in your specific case.

In a web framework that we have written, when we package the distribution files for use in a production environment, we specifically remove all the comments to make sure that LIVE apps are not penalized by our many comments (typically, the source file of our "String" routines is about 169 KB before removing the comments, and only 46 KB after treatment).

We have abandoned trying to measure the real penalty as it was impossible to cope with the variety of environments, file systems and caching mechanisms. We therefore have decided to distribute our code in 2 flavors: with comments and without comments.

user2816761
  • Good point, but you exercise the same fallacy as the submitter of the accepted answer, who advocated the opposite stance. Without numbers, it's just... small talk. – Sz. Feb 10 '17 at 13:59
2

If you want to improve the performance of your PHP-Application then you should use a bytecode-cache like XCache or APC.

That way the server does not have to parse the PHP-Code on each request. Of course your server has to support that kind of extension.

As for removing the comments: I'm not sure it makes a huge difference (unless your comments are huge).
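On current PHP versions, the bundled OPcache extension plays the role that APC/XCache played at the time; a minimal php.ini sketch (the values are illustrative defaults, not tuning advice):

```ini
; Cache compiled bytecode so PHP doesn't re-parse source files
; (comments included) on every request.
opcache.enable=1
opcache.memory_consumption=128      ; MB of shared memory for compiled scripts
opcache.max_accelerated_files=4000  ; how many scripts may be cached
opcache.validate_timestamps=1       ; re-check files on disk for changes...
opcache.revalidate_freq=60          ; ...at most once every 60 seconds
```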

vstm
0

Quite obviously it can make a difference under HUGE traffic, but it is absolutely negligible on most setups. Think about a website with something like 1 million concurrent connections, where the web application (i.e. a CMS like Joomla, which has many PHP files with large portions of comments) reads, for each connection, those several heavily commented and indented PHP files. Will minifying every PHP file of the application make a difference? I guess it definitely might if you don't have any kind of caching enabled. It's just basic I/O stuff: the smaller you make your file, the less memory is required to load/process it.

DeZeA
0

You can have comments in your PHP files, but I don't recommend using tons of comments in JavaScript.

PHP runs on the server side, so the server can easily handle PHP files with comments.

Racooon
  • Ridiculous. Show me a benchmark that shows comments in JS having any sort of noticeable impact at all. All it does is increase the file size for download. It has no impact on execution time. – Brad Oct 11 '11 at 18:53
  • That's what I am talking about: the browser needs to load the JS file before EXECUTING it. – Racooon Oct 11 '11 at 18:55
  • And even *if* comments were a massive performance drain (which they aren't, try it out), you can fix both that and the file size by using minification. Even the dumbest minifiers can strip comments and unneeded whitespace (a student could write one that does it). Saying "don't use comments" is even worse than the most ridiculous micro-optimization. –  Oct 11 '11 at 18:56
  • @VuralAcar, prior to your edit, you were talking about running JavaScript, not downloading it. Since you have edited, I will remove my downvote. – Brad Oct 11 '11 at 19:08
  • @delnan and Brad, thank you for your corrections and tips! I actually didn't want to say "don't use comments", I just wanted to talk about load time. Sorry about it. – Racooon Oct 11 '11 at 19:17
0

It makes a difference in JavaScript, since you want to send less data to the browser, but in PHP it just does not matter. There is no performance penalty for comments, since the interpreter ignores them. For JavaScript, you would want to keep a copy of your normal commented .js file, but then always run it through a minifier and create yourscript-min.js for production.

When you need to make changes to your script, just change your normal script, then re-create the minified version. Only use the minified version in production.

Again, for PHP it does not matter; only for JavaScript, and also for HTML files.
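To illustrate the workflow (commented yourscript.js in development, yourscript-min.js in production), here is a deliberately naive comment stripper. Real minifiers such as Closure Compiler or UglifyJS also shorten identifiers and correctly handle comments inside string literals, which this toy does not:

```javascript
// Toy stripper: drops blank lines and whole-line "//" comments only,
// just to show the size win; do NOT use this in place of a real minifier.
function naiveStrip(src) {
  return src
    .split("\n")
    .filter(line => {
      const trimmed = line.trim();
      return trimmed !== "" && !trimmed.startsWith("//");
    })
    .join("\n");
}

const commented = [
  "// Greet the user (useful in yourscript.js, dead weight in yourscript-min.js)",
  "function greet(name) {",
  "  return 'Hello, ' + name;",
  "}"
].join("\n");

const minified = naiveStrip(commented);
console.log(minified.length < commented.length); // true -- smaller download
```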

Dmitri Snytkine
0

It's better to remove all comments from your JS files and even use a minifier tool on them. Decreasing the JS file sizes decreases the page load time on the client (since they need to download them) and costs less bandwidth (which matters to whoever pays for it).

0

If you have something configured on your system to "compress" your JavaScript on the fly, there are a few gotchas in doing this. I've actually implemented this with .htaccess myself, and you can have HUGE performance gains while keeping commented code on the server itself.

I used Google's Closure tools (a jar file on the server) and re-run Closure whenever md5_file() in PHP reports that the file has changed.

Next, I used ETags to assign a tag to that file. I also cache that file.

I also return a 304 Not Modified when the ETag matches. If it doesn't, I return the new file and update the user's ETag. This is CRITICAL, because if you return a 200/OK you're passing back the whole file again.

The key here is that you lose performance if you compress on the fly, because you're always compressing and running PHP code. You can implement it correctly if you spend the time to do it. I personally like the technique because I can patch live server code without serving a non-minified version. The "first run" of this technique is slow, but subsequent users pull down a file cached on the server, and I return the 304 Not Modified thereafter. You have to do all this magic in your compressing PHP file.

I mention .htaccess here because I use a rewrite rule in it to tell the website which files to compress and which not to. E.g. mylibrary.jsc tells my website to compress it with Closure, while yourlibrary.js lets me have other .js files out there and compress on demand.

Mech