2006

We've got a PHP application and want to count all the lines of code under a specific directory and its subdirectories.

We don't need to ignore comments, as we're just trying to get a rough idea.

wc -l *.php 

That command works great for a given directory, but it ignores subdirectories. I was thinking the following command might work, but it is returning 74, which is definitely not the case...

find . -name '*.php' | wc -l

What's the correct syntax to feed in all the files from a directory recursively?

Exploring
user77413

51 Answers

3207

Try:

find . -name '*.php' | xargs wc -l

or (when file names include special characters such as spaces)

find . -name '*.php' | sed 's/.*/"&"/' | xargs  wc -l

The SLOCCount tool may help as well.

It will give an accurate source lines of code count for whatever hierarchy you point it at, as well as some additional stats.

Sorted output:

find . -name '*.php' | xargs wc -l | sort -nr

Jarl
Peter Elespuru
  • 46
    http://cloc.sourceforge.net/ might be worth looking at as an alternative to sloccount (more languages but less information) – AsTeR May 17 '12 at 22:46
  • 38
    with include files also: `find . -name '*.php' -o -name '*.inc' | xargs wc -l` – rymo Jul 24 '12 at 13:32
  • 61
    This will print more than one number when there are many files (because `wc` will be run multiple times). Also doesn't handle many special file names. – l0b0 Apr 23 '13 at 11:56
  • 1
    Is there a way to exclude a directory (e.g ./tests) from the count? – iddober May 21 '13 at 04:25
  • 4
    This should exclude blank lines `$ find . -name '*.php' | xargs cat | awk '/[a-zA-Z0-9]/ {i++} END{print i}'` – steampowered Jul 25 '13 at 02:00
  • 51
    @idober: `find . -name "*.php" -not -path "./tests*" | xargs wc -l` – endre Oct 19 '13 at 09:32
  • Pay attention: wc counts line-ending characters, not lines. I describe this later in this post. Use grep -c ^, not wc -l. If you need to, grep can also filter out lines that contain only whitespace while counting. – Znik Dec 20 '13 at 08:57
  • 25
    If a directory name contains any spaces... the above command fails!! – nitish712 Mar 01 '14 at 17:05
  • 3
    This should be a work-around for that: `find . -name '*.php' | awk '{gsub(" ","\\ ", $0);print $0}' |xargs wc -l` – nitish712 Mar 01 '14 at 17:11
  • 3
    As an aside, xargs isn't available natively in the Windows shell, but if you happen to have git installed then this solution works great in git bash. – yoyo May 12 '14 at 21:08
  • 8
    @nitish712 find and xargs handle the space issue using `-print0` and `-0` options, respectively. So you can try something like `find . -name "*.php" -print0 | xargs -0 wc -l` – Doug Richardson Sep 09 '14 at 18:46
  • @l0b0 does that mean the last total will be inaccurate or just displayed multiple times? – Daniel Kaplan Sep 10 '14 at 20:56
  • 1
    @tieTYT It will be inaccurate, because it will chunk files together and count the lines in each chunk, without summarising. – l0b0 Sep 10 '14 at 21:36
  • 1
    @l0b0 ah, then (no offense intended) this seems like an inferior solution compared to Shizzmo's – Daniel Kaplan Sep 10 '14 at 21:38
  • @tieTYT Technically speaking, yes, but it's POSIX compatible (`-print0` is not), shorter and it'll work unless you have very many files. – l0b0 Sep 11 '14 at 07:12
  • For iOS app: `find . -name '*.[h|m|plist]' | xargs wc -l` – Ascendant Oct 26 '14 at 04:50
  • 3
    `sudo apt-get install sloccount` on Ubuntu. Then `$ sloccount mydir` – vcardillo Jan 27 '15 at 22:43
  • @pyeleven how do I exclude multiple folders? – rclai May 19 '15 at 12:40
  • 1
    @l0b0 for a big number of files it should work with a `cat` inbetween: `find . -name '*.php' | xargs cat | wc -l` – trudolf Aug 17 '15 at 04:03
  • I am sorry but i would like to understand why would using xargs work and without xargs it would not work – aceminer Nov 02 '15 at 05:36
  • 1
    @idober [-prune](http://manpages.ubuntu.com/manpages/precise/en/man1/find.1.html), e.g. `find . -name tests -prune -o -type f -name '*.php' | xargs wc -l` – jalanb Jan 25 '16 at 19:19
  • How would I write a function to shortcut this command in bash? In my .bash_profile I currently have: function lncnt { find . -name $1 | xargs wc -l; } but when I run the command it only counts lines for the first file alphabetically. – nhyne Feb 17 '16 at 17:08
  • What is the difference between piping the output and using xargs? – Boyan Apr 11 '16 at 12:03
  • Following up on @vcardillo 's comment, if you are on OSX and have Homebrew installed, you can run: `brew install sloccount` and then `sloccount mydir` – Axe Jun 17 '16 at 16:20
  • 1
    BTW it seems like cloc has now moved to [github](https://github.com/AlDanial/cloc) – superjos Sep 27 '16 at 10:17
  • 1
    I know question is about PHP, but i bet people are coming for other languages too. To simpler copy-paste experiences here are couple lines. JS: `find . -name '*.js' | xargs wc -l` CSS: `find . -name '*.css' | xargs wc -l` JSX: `find . -name '*.jsx' | xargs wc -l` – Lukas Liesis Nov 23 '16 at 20:22
  • This was not accurate for me, compare the results of this to Michael Wild. For me, Michael Wild's command gave the accurate result. – Goose Jan 04 '17 at 15:26
  • I'm surprised there's no simple solution here for the "path has a space in it" problem. This is what I ended up with: `find . -name '*.swift' -exec wc -l {} \+`. The `+` at the end means that `wc` is run just once, which is fast! – ephemer Nov 30 '17 at 16:06
  • because of command line limits this can generate multiple totals which need to be added, try using | xargs cat | wc -l to overcome that. – Peter Quiring Jun 23 '19 at 23:55
  • At least on my machine (Ubuntu 18.04), I have `sort (GNU coreutils) 8.28` which doesn't do what expected with `find . -name '*.php' | xargs wc -l | sort`, because it sorts `wc`'s output as text. In order to fix that I had to use `sort -n` instead. Maybe other versions of `sort` behave differently. – vvaltchev Sep 12 '19 at 22:58
  • `xargs` hasn't been necessary for a few decades now. Am I wrong? – Maxim Egorushkin Aug 27 '20 at 22:48
  • Is it possible to do this for all text files, not just ones with a specific extension? – Aaron Franke Oct 01 '20 at 07:04
  • `type -file` or whatever might be nice... – Alexander Mills Nov 12 '22 at 04:49
563

For another one-liner:

( find ./ -name '*.php' -print0 | xargs -0 cat ) | wc -l

It works on names with spaces and only outputs one number.
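
A quick way to see why the `-print0`/`-0` pair matters (the scratch directory and file names below are invented for the demo):

```shell
# A PHP file whose name contains a space, in a directory that also has one
mkdir -p "/tmp/loc demo"
printf 'line1\nline2\nline3\n' > "/tmp/loc demo/a b.php"

# Plain `| xargs wc -l` would split "a b.php" into two arguments;
# -print0/-0 delimits names with NUL bytes so they pass through intact
( find "/tmp/loc demo" -name '*.php' -print0 | xargs -0 cat ) | wc -l
```

Running the plain `find … | xargs wc -l` variant on the same tree instead produces "No such file or directory" errors, because the path gets split on the spaces.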

Peter Mortensen
Shizzmo
  • 1
    +1 ditto...searched forever...all the other "find" commands only returned the # of actual files....the -print0 stuff here got the actual line count for me!!! thanks! – Ronedog Feb 26 '11 at 05:10
  • 2
    Best solution I've found. I parameterized the path and filetype and added this code to a script on my path. I plan to use it frequently. – S.C. Jun 18 '12 at 21:29
  • Why are you thanked for `print0`? What does that mean? – Torben Gundtofte-Bruun Dec 31 '13 at 09:26
  • 3
    @TorbenGundtofte-Bruun - see `man find` .. print0 with xargs -0 lets you operate on files that have spaces or other weird characters in their name – Shizzmo Jan 01 '14 at 06:18
  • 2
    @TorbenGundtofte-Bruun - also, the -0 in xargs corresponds to the print0, it's kind of encoding/decoding to handle the spaces. – Tristan Reid Feb 01 '14 at 00:37
  • Is there any way to exclude specific subdirectories with this? – user2992596 Jun 18 '14 at 14:58
  • 8
    If you need more than one name filter, I've found that (at least with the MSYSGit version of find), you need extra parens: `( find . \( -name '*.h' -o -name '*.cpp' \) -print0 | xargs -0 cat ) | wc -l` – Zrax Jul 27 '14 at 19:10
  • 2
    This should be the accepted answer in my opinion since it gives a single number (something the question implied). – Pithikos Aug 12 '14 at 15:19
  • I love how fast this command is on a journaled file system (instant) – Design by Adrian Oct 23 '14 at 13:09
  • @Ronedog: the reason it works is because of the "cat" command that xargs is executing. The -print0 just means that it uses null characters to separate the filenames that are being passed to xargs so that spaces will be handled properly. – jmh Jan 15 '15 at 14:57
  • 1
    @DesignbyAdrian: Journaling helps with crash recovery, not speed. It is likely you are seeing good performance due to caching or a very fast HDD. – jmh Jan 15 '15 at 14:58
  • @Shizzmo your solution does not exclude blank lines. – Michael Mar 31 '15 at 09:04
  • Errors will come if there are recursive directories as it will try to cat out the directories as well... or worse in some linux system cat of directory will return the files within so in total number of lines, the count of number of files will be added – santify Jun 01 '15 at 19:43
  • This one, unlike the accepted answer, works for directory names containing spaces. – Victor Sergienko Jul 20 '16 at 18:51
  • Similarly, I dropped the cat to print out file names and this achieved the same total output for me on OS X: `find . -name '*.php' -print0 | xargs -0 wc -l` – Taylor D. Edmiston Sep 07 '16 at 21:00
  • This approach does not work on real projects. you'd run out of memory – Wisienkas Dec 19 '17 at 15:09
  • `( find ./ -name '*.java' -print0 | xargs -0 cat | sed '/^\s*$/d' ) | wc -l` : adding `sed '/^\s*$/d'` will ignore all the blank lines in file. – yusong Jan 22 '18 at 21:48
  • 1
    @Wisienkas Not true, it uses next to no memory. Each file is streamed to `wc -l` . It is not every file being catted together then sent to `wc`. That is why we use xargs here. Remember, these things were made a long time ago when people had MUCH less memory. – Michael Papile Aug 08 '18 at 15:44
  • for counting java code comments: find ./ -name '*.java' -print0 | xargs -0 cat | grep '* ' | wc -l – Avi Chalbani May 17 '20 at 07:01
556

You can use the cloc utility, which is built for this exact purpose. It reports the number of lines in each language, together with how many of them are comments, etc. cloc is available on Linux, Mac and Windows.

Usage and output example:

$ cloc --exclude-lang=DTD,Lua,make,Python .
    2570 text files.
    2200 unique files.
    8654 files ignored.

http://cloc.sourceforge.net v 1.53  T=8.0 s (202.4 files/s, 99198.6 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
JavaScript                    1506          77848         212000         366495
CSS                             56           9671          20147          87695
HTML                            51           1409            151           7480
XML                              6           3088           1383           6222
-------------------------------------------------------------------------------
SUM:                          1619          92016         233681         467892
-------------------------------------------------------------------------------
Peter Mortensen
simao
  • 15
    That's a lovely tool that runs nice and quickly giving useful stats at the end. Love it. – Rob Forrest Jun 15 '12 at 13:23
  • 4
    Note that you can run Unix commands on Windows using cygwin (or other similar ports/environments). To me, having this kind of access so extremely useful, it's a necessity. A unix command line is magical. I especially like perl and regular expressions. – Curtis Yallop May 23 '14 at 20:51
  • CLOC and SLOCCount work fine on mid 2015 macbook. Note their numbers are close but not exactly the same for 127k Java Android project. Also note the iOS equivalent had 2x the LoC; so, the "cost" metric in SLOCCount might be off (or maybe iOS dev make 2x what Android dev make. :-) – maxweber Jun 23 '15 at 14:19
  • 3
    Would you consider editing the beginning of this question to make it clear that `cloc` is cross-platform since it's just a Perl script? – Kyle Strand Jun 08 '16 at 22:27
  • 1
    Just perfect, works fine in Windows bash as well of course. – yurisnm Apr 15 '19 at 18:14
  • 2
    I didn't have that tool installed, but with Node installed you can go `npx cloc myApp` to run it without installing globally – Marcus Hammarberg Jun 13 '19 at 07:28
  • Note, that `cloc` is not available as a cygwin package. You have to, instead, download the exe version here https://github.com/AlDanial/cloc. You can then run `cloc` in your cygwin shell provided you convert unix path string to windowish format. – daparic Jun 17 '19 at 14:03
  • Only issue is, `cloc` is, (being written in Perl) **excruciatingly slow** compared to `wc`. – rustyx Feb 22 '21 at 15:58
  • I like CLOC but I'd love to see a word count feature. – Alper Apr 27 '23 at 10:13
  • @CurtisYallop Nowadays [MSYS2](https://www.msys2.org/) would be preferable instead of Cygwin. [Apparently its package repo has `cloc` too!](https://packages.msys2.org/base/cloc) – underscore_d Jun 08 '23 at 08:27
453

If using a decently recent version of Bash (or ZSH), it's much simpler:

wc -l **/*.php

In the Bash shell this requires the globstar option to be set, otherwise the ** glob-operator is not recursive. To enable this setting, issue

shopt -s globstar

To make this permanent, add it to one of the initialization files (~/.bashrc, ~/.bash_profile etc.).
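
As a quick sanity check of the recursive glob (file names invented for the demo; since `globstar` only affects the shell that performs the expansion, it is enabled inside the same `bash -c` invocation):

```shell
mkdir -p /tmp/globstar-demo/sub/deeper
printf 'a\nb\n' > /tmp/globstar-demo/one.php
printf 'c\n'   > /tmp/globstar-demo/sub/deeper/two.php

# Requires Bash >= 4; the last output line is the grand total (here: 3)
bash -c 'shopt -s globstar; cd /tmp/globstar-demo && wc -l **/*.php'
```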

Michael Wild
  • 11
    I am upvoting this for simplicity, however I just want to point out that it doesn't appear to search the directories recursively, it only checks the subdirectories of the current directory. This is on SL6.3. – Godric Seer Apr 16 '13 at 01:44
  • 10
    That depends on your shell and the options you have set. Bash requires [`globstar` to be set](http://www.gnu.org/software/bash/manual/bashref.html#Pattern-Matching) for this to work. – Michael Wild Apr 16 '13 at 05:52
  • @MichaelWild, and what if I need more lines of code than this can handle? It is returning a very low value for the Linux Kernel... – Peter Jun 27 '13 at 00:14
  • 2
    @PeterSenna, with the current 3.9.8 kernel archive, the command `wc -l **/*.[ch]` finds a total of 15195373 lines. Not sure whether you consider that to be a "very low value". Again, you need to make sure that you have `globstar` enabled in Bash. You can check with `shopt globstar`. To enable it explicitly, do `shopt -s globstar`. – Michael Wild Jun 28 '13 at 08:06
  • @Sheena Have you actually read the comments? If you enable the `globstar` shell option, it *is* recursive. – Michael Wild Aug 05 '13 at 06:15
  • 6
    @MichaelWild This is a good solution, but it will still overflow `ARG_MAX` if you have a large number of `.php` files, since `wc` is not builtin. – Reinstate Monica Please Jul 27 '14 at 20:48
  • It seems the result is larger than the accepted answer. Anyone know why? – AlbertSamuel May 24 '16 at 02:16
  • 1
    @AlbertSamuel No, you'd need to compare the list of files produced by both methods. My method has the problem of not working for large numbers of files, as mentioned by @BroSlow. The accepted answer will fail if the paths produced by `find` contain spaces. That could be fixed by using `print0` and `--null` with the `find` and `xargs` calls, respectively. – Michael Wild May 24 '16 at 05:01
  • If you are on a Mac you have to install a recent version of Bash to set globstar in shopt . By default MacOS has version 3.2 installed. But globstar is available from version 4. Another issue you might encounter is that your IDE (in my case PhpStorm) is still using the old (MacOS' native) Bash version. So you have to change the "Shell Path" in Preferences>Tools>Terminal from `/bin/bash` to `/usr/local/bin/bash` or whatever the path to the new Bash version is – FullStack Alex Apr 28 '20 at 17:08
  • 2
    If you want to include multiple file types (via extensions) you can extend this and use bash expansion: ```bash wc -l **/*.{py,yml,md,js,html} ``` – geogeo May 14 '22 at 05:20
117

On Unix-like systems, there is a tool called cloc which provides code statistics.

I ran it on a random directory in our code base; it says:

      59 text files.
      56 unique files.
       5 files ignored.

http://cloc.sourceforge.net v 1.53  T=0.5 s (108.0 files/s, 50180.0 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
C                               36           3060           1431          16359
C/C++ Header                    16            689            393           3032
make                             1             17              9             54
Teamcenter def                   1             10              0             36
-------------------------------------------------------------------------------
SUM:                            54           3776           1833          19481
-------------------------------------------------------------------------------
Peter Mortensen
Calmarius
46

You didn't specify how many files there are or what the desired output is.

This may be what you are looking for:

find . -name '*.php' | xargs wc -l
Peter Mortensen
Paweł Polewicz
  • 5
    This will work, as long as there are not too many files : if there are a lot of files, you will get several lines as a result (xargs will split the files list in several sub-lists) – Pascal MARTIN Aug 31 '09 at 17:50
  • ah, yes. That's why I said He didn't specify how many files are there. My version is easier to remember, but Shin's version is better if You have more than a few files. I'm voting it up. – Paweł Polewicz Mar 18 '10 at 18:53
  • I needed to adapt this for use in a function, where single quotes are too restrictive: `go () { mkdir /tmp/go; [[ -f ./"$1" ]] && mv ./"$1" /tmp/go; (find ./ -type f -name "$*" -print0 | xargs -0 cat ) | wc -l; wc -l /tmp/go/*; mv /tmp/go/* . }` Results were close to slocount for `*.py`, but it didn't know `*.js`, `*.html`. – jalanb Jan 25 '16 at 19:09
43

Yet another variation :)

$ find . -name '*.php' | xargs cat | wc -l

This will give the total sum, instead of file-by-file.

On systems where find requires an explicit starting path, add . after find to make it work.

Peter Mortensen
Motiejus Jakštys
  • At least in cygwin, I had better results with: `$ find -name \*\.php -print0 | xargs -0 cat | wc -l` – Martin Haeberli Dec 12 '14 at 22:04
  • on Darwin, this just gives a grand total: `find . -name '*.php' | xargs cat | wc -l` ... whereas this gives file-by-file and a grand total: `find . -name '*.php' | xargs wc -l` – OsamaBinLogin Mar 19 '16 at 00:14
35

Use find's -exec and awk. Here we go:

find . -type f -exec wc -l {} \; | awk '{ SUM += $0} END { print SUM }'

This snippet searches all files (-type f). To filter by file extension, use -name:

find . -name '*.py' -exec wc -l '{}' \; | awk '{ SUM += $0; } END { print SUM; }'
Peter Mortensen
jonhattan
  • 2
    Functionally, this works perfectly, but on large listing (linux source) it is really slow because it's starting a wc process for each file instead of 1 wc process for all the files. I timed it at 31 seconds using this method compared to 1.5 seconds using `find . -name '*.c' -print0 |xargs -0 wc -l`. That said, this faster method (at least on OS X), ends up printing "total" several times so some additional filtering is required to get a proper total (I posted details in my answer). – Doug Richardson Sep 09 '14 at 18:35
  • This has the benefit of working for an unlimited number of files. Well done! – ekscrypto Oct 25 '16 at 01:50
  • 1
    this is far better solution once working with large amount of GB and files. doing one `wc` on a form of a `cat` is slow because the system first must process all GB to start counting the lines (tested with 200GB of jsons, 12k files). doing `wc` first then counting the result is far faster – ulkas Nov 07 '18 at 08:03
  • 2
    @DougRichardson, you could consider this instead: `find . -type f -exec wc -l {} \+` or `find . -name '*.py' -type f -exec wc -l {} \+` which prints a total at the end of the output. If all you're interested in is the total, then you could go a bit further and use `tail`: `find . -type f -exec wc -l {} \+ | tail -1` or `find . -name '*.py' -type f -exec wc -l {} \+ | tail -1` – JamieJag Mar 20 '19 at 12:27
31

The tool Tokei displays statistics about code in a directory. Tokei shows the number of files, the total lines within those files, and the code, comments, and blanks grouped by language. Tokei is also available on Mac, Linux, and Windows.

An example of the output of Tokei is as follows:

$ tokei
-------------------------------------------------------------------------------
 Language            Files        Lines         Code     Comments       Blanks
-------------------------------------------------------------------------------
 CSS                     2           12           12            0            0
 JavaScript              1          435          404            0           31
 JSON                    3          178          178            0            0
 Markdown                1            9            9            0            0
 Rust                   10          408          259           84           65
 TOML                    3           69           41           17           11
 YAML                    1           30           25            0            5
-------------------------------------------------------------------------------
 Total                  21         1141          928          101          112
-------------------------------------------------------------------------------

Tokei can be installed by following the instructions on the README file in the repository.

Peter Mortensen
Joel Ellis
28

A more general and simple approach, for when you need to count files with several different extensions (say, native sources as well):

wc $(find . -type f | egrep "\.(h|c|cpp|php|cc)" )
Peter Mortensen
sergeych
  • 6
    this does not do quite what you think. find . -name '*.[am]' is identical to find . -name '*.[a|m]' both will find all files that ends with .m or .a – Omry Yadan Dec 18 '13 at 18:24
  • 1
    but the second will also find files ending in .| , if any. So [h|c|cpp|php|cc] ends up being the same as [hcp|] . – OsamaBinLogin Mar 19 '16 at 00:10
  • backticks are deprecated, prefer `$()` – Sandburg Jun 20 '19 at 09:06
  • This works under Cygwin. Of course, the "C:\" drive has to follow the cygwin convention, like for example: wc $(find /cygdrive/c//SomeWindowsFolderj/ -type f | egrep "\.(h|c|cpp|php|cc)" ) – Christian Gingras Dec 21 '19 at 20:34
25

POSIX

Unlike most other answers here, these work on any POSIX system, for any number of files, and with any file names (except where noted).


Lines in each file:

find . -name '*.php' -type f -exec wc -l {} \;
# faster, but includes total at end if there are multiple files
find . -name '*.php' -type f -exec wc -l {} +

Lines in each file, sorted by file path

find . -name '*.php' -type f | sort | xargs -L1 wc -l
# for files with spaces or newlines, use the non-standard sort -z
find . -name '*.php' -type f -print0 | sort -z | xargs -0 -L1 wc -l

Lines in each file, sorted by number of lines, descending

find . -name '*.php' -type f -exec wc -l {} \; | sort -nr
# faster, but includes total at end if there are multiple files
find . -name '*.php' -type f -exec wc -l {} + | sort -nr

Total lines in all files

find . -name '*.php' -type f -exec cat {} + | wc -l
Paul Draper
24

There is a little tool called sloccount to count the lines of code in a directory.

It should be noted that it does more than you asked for: it ignores empty lines and comments, groups the results per programming language, and calculates some statistics.

John Bachir
sebasgo
15

You want a simple for loop:

total_count=0
for file in $(find . -name *.php -print)
do
    count=$(wc -l $file)
    let total_count+=count
done
echo "$total_count"
Peter Mortensen
ennuikiller
  • 3
    isn't this overkill compared to the answers that suggest `xargs`? – Nathan Fellman Aug 31 '09 at 17:52
  • 7
    No, Nathan. The xargs answers won't necessarily print the count as a single number. It may just print a bunch of subtotals. – Rob Kennedy Aug 31 '09 at 18:10
  • 4
    what will this program do if file names contain spaces? What about newlines? ;-) – Paweł Polewicz Aug 31 '09 at 20:05
  • 44
    If your file names contain new lines, I'd say you have bigger problems. – Kzqai Aug 31 '12 at 18:23
  • 2
    @ennuikiller Number of issues with this, first of all it will break on files with whitespaces. Setting `IFS=$'\n'` before the loop would at least fix it for all but files with newlines in their names. Second, you're not quoting `'*.php'`, so it will get expanded by the shell and not `find`, and ergo won't actually find any of the php files in subdirectories. Also the `-print` is redundant, since it's implied in the absence of other actions. – Reinstate Monica Please Jul 27 '14 at 21:02
  • I would change line 4 to: count=$(wc -l < $file) in order to suppress the syntax errors, otherwise this works as an alternative although I agree it's a bit overkill. – tylermoseley May 25 '16 at 16:32
13

For sources only:

wc `find`

To filter, just use grep:

wc `find | grep .php$`
Peter Mortensen
kekszumquadrat
12

A straightforward one that is fast, uses all the search/filtering power of find, doesn't fail when there are too many files (argument-count overflow), works fine with files with funny symbols in their names, doesn't use xargs, and doesn't launch a uselessly high number of external commands (thanks to + in find's -exec). Here you go:

find . -name '*.php' -type f -exec cat -- {} + | wc -l
Peter Mortensen
gniourf_gniourf
  • 2
    I was about to post a variant of this myself (with `\;` instead of `+` as I wasn't aware of it), this answer should be the correct answer. – Mark K Cowan Aug 03 '15 at 11:26
  • I did ( find . -type f -exec cat {} \; |wc -l ) then I saw this. Just wondering what '--' and '+' in this solution mean and the difference to my version regarding number of external commands. – grenix Nov 18 '21 at 13:27
  • @grenix: your version will spawn a new `cat` for each file found, whereas the `\+` version will give all the files found to `cat` in one call. The `--` is to mark the end of options (it's a bit unnecessary here). – gniourf_gniourf Nov 18 '21 at 20:11
  • what I do not understand is how this avoids number of arguments overflow. If I do 'find . -type f -exec cat -- {} + |more' and ' ps aux|grep "cat "' in another terminal I get somthing like '... 66128 0.0 0.0 7940 2020 pts/10 S+ 13:45 0:00 cat -- ./file1 ./file2 ...' – grenix Nov 26 '21 at 12:46
10

None of the answers so far gets at the problem of filenames with spaces.

Additionally, all that use xargs are subject to fail if the total length of paths in the tree exceeds the shell environment size limit (defaults to a few megabytes in Linux).

Here is one that fixes these problems in a pretty direct manner. The subshell takes care of files with spaces. The awk totals the stream of individual file wc outputs, so it ought never to run out of space. It also restricts the exec to files only (skipping directories):

find . -type f -name '*.php' -exec bash -c 'wc -l "$0"' {} \; | awk '{s+=$1} END {print s}'
Peter Mortensen
Gene
9

I know how the question is tagged, but it seems that the problem you're trying to solve is also PHP-related.

Sebastian Bergmann wrote a tool called PHPLOC that does what you want and on top of that provides you with an overview of a project's complexity. This is an example of its report:

Size
  Lines of Code (LOC)                            29047
  Comment Lines of Code (CLOC)                   14022 (48.27%)
  Non-Comment Lines of Code (NCLOC)              15025 (51.73%)
  Logical Lines of Code (LLOC)                    3484 (11.99%)
    Classes                                       3314 (95.12%)
      Average Class Length                          29
      Average Method Length                          4
    Functions                                      153 (4.39%)
      Average Function Length                        1
    Not in classes or functions                     17 (0.49%)

Complexity
  Cyclomatic Complexity / LLOC                    0.51
  Cyclomatic Complexity / Number of Methods       3.37

As you can see, the information provided is a lot more useful from the perspective of a developer, because it can roughly tell you how complex a project is before you start working with it.

Ja͢ck
9

If you want to keep it simple, cut out the middleman and just call wc with all the filenames:

wc -l `find . -name "*.php"`

Or in the modern syntax:

wc -l $(find . -name "*.php")

This works as long as there are no spaces in any of the directory names or filenames. And as long as you don't have tens of thousands of files (modern shells support really long command lines). Your project has 74 files, so you've got plenty of room to grow.
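
To see how much headroom you actually have, you can ask the system for its limit on the combined size of arguments and environment (the exact number varies by OS):

```shell
# Upper bound, in bytes, on command-line arguments plus environment variables
getconf ARG_MAX
```

On typical modern Linux systems this prints a value in the megabyte range, which is why a 74-file project is nowhere near the limit.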

alexis
  • I like this one! If you are in hybrid C/C++ environment: ```wc -l `find . -type f \( -name "*.cpp" -o -name "*.c" -o -name "*.h" \) -print` ``` – Bram Jul 20 '16 at 03:55
7

wc -l? Better use grep -c ^

wc -l? Wrong!

The wc command counts newline characters, not lines! When the last line in the file does not end with a newline, it will not be counted!

If you still want to count lines, use grep -c ^. Full example:

# This example prints the line count for all found files
total=0
while read -r FILE; do
     # You see, use 'grep' instead of 'wc', for proper counting
     count=$(grep -c ^ < "$FILE")
     echo "$FILE has $count lines"
     let total=total+count # in Bash; convert this for another shell
done < <(find /path -type f -name "*.php") # no pipe here, so $total survives the loop
echo "TOTAL LINES COUNTED:  $total"

Finally, watch out for the wc -l trap: it counts newline characters, not lines!
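
The difference is easy to demonstrate with a file whose last line has no trailing newline (the file name is just for the demo):

```shell
# Two lines of text, but no newline character after the second one
printf 'first\nsecond' > /tmp/no-trailing-newline.txt

wc -l     < /tmp/no-trailing-newline.txt   # prints 1: one newline character
grep -c ^   /tmp/no-trailing-newline.txt   # prints 2: two lines, last one incomplete
```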

Peter Mortensen
Znik
  • Please read the [POSIX definition of a line](http://pubs.opengroup.org/onlinepubs/009695399/basedefs/xbd_chap03.html#tag_03_205). With `grep -c ^` you're counting the number of [incomplete lines](http://pubs.opengroup.org/onlinepubs/009695399/basedefs/xbd_chap03.html#tag_03_194), and such incomplete lines cannot appear in a [text file](http://pubs.opengroup.org/onlinepubs/009695399/basedefs/xbd_chap03.html#tag_03_392). – gniourf_gniourf Feb 05 '15 at 10:13
  • 2
    I know it. In practice only last line can be incomplete because it hasn't got EOL. Idea is counting all lines including incomplete one. It is very frequent mistake, counting only complete lines. after counting we are thinking "why I missed last line???". This is answer why, and recipe how to do it properly. – Znik Feb 19 '15 at 13:31
  • Or, if you want a one liner: `find -type f -name '*.php' -print0 | xargs -0 grep -ch ^ | paste -sd+ - | bc` See here for alternatives to `bc`: https://stackoverflow.com/q/926069/2400328 – techniao Dec 14 '19 at 07:40
6

Bash tools are always nice to use, but for this purpose it seems more efficient to just use a dedicated tool. I played with some of the main ones as of 2022, namely cloc (Perl), gocloc (Go), and pygount (Python).

I got varying results without tweaking them much. The most accurate, and by far the fastest, seems to be gocloc.

Example on a small laravel project with a vue frontend:

gocloc

$ ~/go/bin/gocloc /home/jmeyo/project/sequasa
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
JSON                             5              0              0          16800
Vue                             96           1181            137           8993
JavaScript                      37            999            454           7604
PHP                            228           1493           2622           7290
CSS                              2            157             44            877
Sass                             5             72            426            466
XML                             11              0              2            362
Markdown                         2             45              0            111
YAML                             1              0              0             13
Plain Text                       1              0              0              2
-------------------------------------------------------------------------------
TOTAL                          388           3947           3685          42518
-------------------------------------------------------------------------------

cloc

$ cloc /home/jmeyo/project/sequasa
     450 text files.
     433 unique files.                                          
      40 files ignored.

github.com/AlDanial/cloc v 1.90  T=0.24 s (1709.7 files/s, 211837.9 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
JSON                             5              0              0          16800
Vuejs Component                 95           1181            370           8760
JavaScript                      37            999            371           7687
PHP                            180           1313           2600           5321
Blade                           48            180            187           1804
SVG                             27              0              0           1273
CSS                              2            157             44            877
XML                             12              0              2            522
Sass                             5             72            418            474
Markdown                         2             45              0            111
YAML                             4             11             37             53
-------------------------------------------------------------------------------
SUM:                           417           3958           4029          43682
-------------------------------------------------------------------------------

pygount

$ pygount --format=summary /home/jmeyo/project/sequasa
┏━━━━━━━━━━━━━━━┳━━━━━━━┳━━━━━━━┳━━━━━━━┳━━━━━━━┳━━━━━━━━━┳━━━━━━┓
┃ Language      ┃ Files ┃     % ┃  Code ┃     % ┃ Comment ┃    % ┃
┡━━━━━━━━━━━━━━━╇━━━━━━━╇━━━━━━━╇━━━━━━━╇━━━━━━━╇━━━━━━━━━╇━━━━━━┩
│ JSON          │     5 │   1.0 │ 12760 │  76.0 │       0 │  0.0 │
│ PHP           │   182 │  37.1 │  4052 │  43.8 │    1288 │ 13.9 │
│ JavaScript    │    37 │   7.5 │  3654 │  40.4 │     377 │  4.2 │
│ XML+PHP       │    43 │   8.8 │  1696 │  89.6 │      39 │  2.1 │
│ CSS+Lasso     │     2 │   0.4 │   702 │  65.2 │      44 │  4.1 │
│ SCSS          │     5 │   1.0 │   368 │  38.2 │     419 │ 43.5 │
│ HTML+PHP      │     2 │   0.4 │   171 │  85.5 │       0 │  0.0 │
│ Markdown      │     2 │   0.4 │    86 │  55.1 │       4 │  2.6 │
│ XML           │     1 │   0.2 │    29 │  93.5 │       2 │  6.5 │
│ Text only     │     1 │   0.2 │     2 │ 100.0 │       0 │  0.0 │
│ __unknown__   │   132 │  26.9 │     0 │   0.0 │       0 │  0.0 │
│ __empty__     │     6 │   1.2 │     0 │   0.0 │       0 │  0.0 │
│ __duplicate__ │     6 │   1.2 │     0 │   0.0 │       0 │  0.0 │
│ __binary__    │    67 │  13.6 │     0 │   0.0 │       0 │  0.0 │
├───────────────┼───────┼───────┼───────┼───────┼─────────┼──────┤
│ Sum           │   491 │ 100.0 │ 23520 │  59.7 │    2173 │  5.5 │
└───────────────┴───────┴───────┴───────┴───────┴─────────┴──────┘

Results are mixed; the closest to reality seems to be the gocloc one, which is also by far the fastest:

  • cloc: 0m0.430s
  • gocloc: 0m0.059s
  • pygount: 0m39.980s
Jean-Christophe Meillaud
  • 1,961
  • 1
  • 21
  • 27
5

Giving out the longest files first (ie. maybe these long files need some refactoring love?), and excluding some vendor directories:

 find . -name '*.php' | xargs wc -l | sort -nr | egrep -v "libs|tmp|tests|vendor" | less
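An alternative sketch: instead of egrep-ing the wc output afterwards (which also drops any per-file line whose path happens to match a pattern), you can exclude the directories in find itself with -not -path (a GNU/BSD find extension). The tiny tree built below is just for demonstration:

```shell
# Demo tree:
d=$(mktemp -d) && cd "$d"
mkdir -p src vendor
printf 'a\nb\n' > src/x.php
printf 'c\n'    > vendor/y.php

# -not -path drops vendor/ files before wc ever sees them:
find . -name '*.php' -not -path '*/vendor/*' | xargs wc -l | sort -nr
```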
Matt
  • 161
  • 2
  • 3
5

For Windows, an easy-and-quick tool is LocMetrics.

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
walv
  • 2,680
  • 3
  • 31
  • 36
  • It's pretty unlikely OP is on Windows if they're using bash. –  Mar 01 '18 at 03:14
  • 1
    @VanessaMcHale question title and description both don't clearly require unix-only solution. So Windows based solution is acceptable. Also Google pointed me to this page when I was looking for similar solution. – walv Mar 06 '18 at 00:42
  • This comment helped me. I tried this and it works well. – Allan F Jan 31 '20 at 00:32
5

You can use a utility called codel (link). It's a simple Python module to count lines with colorful formatting.

Installation

pip install codel

Usage

To count lines of C++ files (with .cpp and .h extensions), use:

codel count -e .cpp .h

You can also ignore some files/folder with the .gitignore format:

codel count -e .py -i tests/**

It will ignore all the files in the tests/ folder.

The output looks like:

Long output

You also can shorten the output with the -s flag. It will hide the information of each file and show only information about each extension. The example is below:

Short output

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
voilalex
  • 2,041
  • 2
  • 13
  • 18
4

Very simply:

find /path -type f -name "*.php" | while read FILE
do
    count=$(wc -l < $FILE)
    echo "$FILE has $count lines"
done
Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
ghostdog74
  • 327,991
  • 56
  • 259
  • 343
4

If you want your results sorted by number of lines, you can just add | sort -n (or | sort -rn for descending order) to the first answer, like so:

find . -name '*.php' | xargs wc -l | sort -rn
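A note on the flag: plain sort compares text, not numbers, so without -n a count of 9 sorts after 100. A tiny demonstration:

```shell
printf '9\n10\n2\n' | sort      # lexical order: 10, 2, 9
printf '9\n10\n2\n' | sort -n   # numeric order: 2, 9, 10
```

wc happens to right-align its counts within a single invocation, which masks the problem on small trees, but sort -n / sort -rn is the reliable spelling.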
Paul Pettengill
  • 4,843
  • 1
  • 29
  • 33
4

Something different:

wc -l `tree -if --noreport | grep -e'\.php$'`

This works out fine, but you need at least one *.php file in the current folder or one of its subfolders; otherwise wc stalls, because with no file arguments it sits waiting for input on stdin.

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
nav
  • 1,645
  • 15
  • 22
4

It’s very easy with Z shell (zsh) globs:

wc -l ./**/*.php

If you are using Bash, you just need to upgrade. There is absolutely no reason to use Bash.
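If switching shells isn't an option, Bash 4.0+ can do the same recursive glob once the (off-by-default) globstar option is enabled. A small self-contained sketch:

```shell
# Demo tree in a temp directory:
d=$(mktemp -d) && cd "$d"
mkdir -p a/b
printf 'x\n' > a/b/t.php

shopt -s globstar   # enable recursive ** matching (Bash 4.0+)
wc -l ./**/*.php
```

Unlike find, an unmatched glob is passed to wc literally unless nullglob is also set.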

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
HappyFace
  • 3,439
  • 2
  • 24
  • 43
4

If the files are too many, better to just look for the total line count.

find . -name '*.php' | xargs wc -l | grep -i ' total' | awk '{print $1}'
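Grepping for ' total' can still return several lines, because xargs splits a long file list into multiple wc invocations, each printing its own total. Piping the file contents into a single wc sidesteps that (self-contained sketch):

```shell
# Demo files:
d=$(mktemp -d) && cd "$d"
printf 'a\nb\n' > x.php
printf 'c\n'    > y.php

# One grand total, no matter how many files there are:
find . -name '*.php' -print0 | xargs -0 cat | wc -l   # prints 3
```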
bharath
  • 481
  • 4
  • 10
3

On OS X at least, the find+xargs+wc commands listed in some of the other answers print "total" several times on large listings, and no complete grand total is given. I was able to get a single total for .c files using the following command:

find . -name '*.c' -print0 |xargs -0 wc -l|grep -v total|awk '{ sum += $1; } END { print "SUM: " sum; }'

Doug Richardson
  • 10,483
  • 6
  • 51
  • 77
  • Instead of `grep -v total` you can use `grep total` - which will sum the intermediate sums given by `wc`. It doesn't make sense to re-calculate intermediate sums since `wc` already did it. – Stanislav Bashkyrtsev Dec 25 '21 at 13:18
3

While I like the scripts, I prefer this one as it also shows a per-file summary as well as a total:

wc -l `find . -name "*.php"`
Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
akiva
  • 2,677
  • 3
  • 31
  • 40
3

I wanted to check multiple file types and was too lazy to calculate the total by hand. So I use this now to get the total in one go.

find . -name '*.js' -or -name '*.php' | xargs wc -l | grep 'total'  | awk '{ SUM += $1; print $1} END { print "Total text lines in PHP and JS",SUM }'

79351
15318
Total text lines in PHP and JS 94669

This allows you to chain multiple extension types you wish to filter on. Just add them in the -name '*.js' -or -name '*.php' part, and possibly modify the output message to your liking.
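The same multi-extension filter can also skip the grep/awk summing entirely: -exec cat {} + concatenates every match, so a single wc sees all of them. A self-contained sketch (the grouping parentheses matter, otherwise -o binds too loosely):

```shell
# Demo files:
d=$(mktemp -d) && cd "$d"
printf 'a\n'    > x.js
printf 'b\nc\n' > y.php
printf 'z\n'    > z.txt   # wrong extension, not counted

find . -type f \( -name '*.js' -o -name '*.php' \) -exec cat {} + | wc -l   # prints 3
```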

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
Tschallacka
  • 27,901
  • 14
  • 88
  • 133
  • 1
    good hack with awk and no need to install additional tools, thank you! Also, if you want to skip the folders and don't specify file extensions (count in all files), then `-type f` will help here: `find . -type f | xargs wc -l | grep 'total' | awk '{ SUM += $1; print $1} END { print "Total text lines: ",SUM }'` – Sysanin Oct 11 '19 at 15:49
3

You don't need all these complicated, hard-to-remember commands. You just need a Python tool called line-counter.

A quick overview

This is how you get the tool

$ pip install line-counter

Use the line command to get the file count and line count under current directory (recursively):

$ line
Search in /Users/Morgan/Documents/Example/
file count: 4
line count: 839

If you want more detail, just use line -d.

$ line -d
Search in /Users/Morgan/Documents/Example/
Dir A/file C.c                                             72
Dir A/file D.py                                           268
file A.py                                                 467
file B.c                                                   32
file count: 4
line count: 839

And the best part of this tool is, you can add a .gitignore-like configuration file to it. You can set up rules to select or ignore what kind of files to count just like what you do in '.gitignore'.

More description and usage is here: https://github.com/MorganZhang100/line-counter

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
Morgan Zhang
  • 93
  • 1
  • 4
3

If you're on Linux (and I take it you are), I recommend my tool polyglot. It is dramatically faster than either sloccount or cloc and it is more featureful than sloccount.

You can invoke it with

poly .

or

poly

so it's much more user-friendly than some convoluted Bash script.

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
3

If you need just the total number of lines in, let's say, your PHP files, you can use very simple one line command even under Windows if you have GnuWin32 installed. Like this:

cat `/gnuwin32/bin/find.exe . -name "*.php"` | wc -l

You need to specify exactly where find.exe is; otherwise the Windows-provided FIND.EXE (from the old DOS-like commands) will be executed, since it probably comes before GnuWin32 in the PATH environment variable and has different parameters and results.

Please note that in the command above you should use back-quotes, not single quotes.

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
Neven Boyanov
  • 749
  • 8
  • 14
  • In the example above I'm using the bash for windows instead of the cmd.exe that's why there are forward slashes "/" and not back slashes "\". – Neven Boyanov Jul 05 '11 at 08:26
3

Similar to Shizzmo's answer, but uglier and more accurate. If you're using it often, modify it to suit and put it in a script.

This example:

  1. Properly excludes paths that aren't your code (not traversed at all by find)
  2. Filters out compound extensions and other files you wish to ignore
  3. Only includes actual files of the types you specify
  4. Ignores blank lines
  5. Gives a single number as a total
find . \! \( \( \
      -path ./lib -o -path ./node_modules -o -path ./vendor \
      -o -path ./any/other/path/to/skip \
      -o -wholename ./not/this/specific/file.php \
      -o -name '*.min.js' -o -name '*.min.css' \
    \) -prune \) \
  -type f \( -name '*.php' -o -name '*.inc' -o -name '*.js' \
             -o -name '*.scss' -o -name '*.css' \) \
  -print0 | xargs -0 cat | grep -vcE '^[[:space:]]*$'
Walf
  • 8,535
  • 2
  • 44
  • 59
  • Is it possible to do this for all text files, not just ones with specific extensions? – Aaron Franke Oct 01 '20 at 07:04
  • @AaronFranke That should be a lot easier, try: `grep -rvcIE '^[[:space:]]*$'` The `r` flag means search recursively, and the `I` flag means ignore binary files. – Walf Oct 01 '20 at 08:06
2
$ cd directory
$ wc -l * | sort -nr
mishik
  • 9,973
  • 9
  • 45
  • 67
uss
  • 1,271
  • 1
  • 13
  • 29
2

I used this inline script, launched from a source project's directory:

 for i in $(find . -type f); do rowline=$(wc -l "$i" | cut -f1 -d" "); file=$(wc -l "$i" | cut -f2 -d" "); lines=$((lines + rowline)); echo "Lines["$lines"] " $file "has "$rowline"rows."; done && unset lines

That produces this output:

Lines[75]  ./Db.h has 75rows.
Lines[143]  ./Db.cpp has 68rows.
Lines[170]  ./main.cpp has 27rows.
Lines[294]  ./Sqlite.cpp has 124rows.
Lines[349]  ./Sqlite.h has 55rows.
Lines[445]  ./Table.cpp has 96rows.
Lines[480]  ./DbError.cpp has 35rows.
Lines[521]  ./DbError.h has 41rows.
Lines[627]  ./QueryResult.cpp has 106rows.
Lines[717]  ./QueryResult.h has 90rows.
Lines[828]  ./Table.h has 111rows.
Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
Luca Davanzo
  • 21,000
  • 15
  • 120
  • 146
2

Excluding blank lines:

find . -name "*.php" | xargs grep -v -c '^$' | awk 'BEGIN {FS=":"} { cnt += $2 } END {print cnt}'

Including blank lines:

find . -name "*.php" | xargs wc -l
Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
2

First, change to the directory whose line count you want to know.

For example, if I want to know the number of lines in all files of a directory named sample, I run cd sample.

Then try the command wc -l *. This will print the number of lines for each file, plus the total number of lines for the entire directory at the end.

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
venky513
  • 321
  • 2
  • 15
2

I do it like this:

Here is the lineCount.c file implementation:

#include <stdio.h>
#include <string.h>
#include <stdlib.h>

int getLinesFromFile(const char*);

int main(int argc, char* argv[]) {
   int total_lines = 0;
   for(int i = 1; i < argc; ++i) {
       total_lines += getLinesFromFile(argv[i]); // *argv is a char*
   }

   printf("You have a total of %d lines in all your file(s)\n",    total_lines);
   return 0;
}


int getLinesFromFile(const char* file_name) {
    int lines = 0;
    FILE* file = fopen(file_name, "r");
    if (file == NULL)  // skip files that cannot be opened
        return 0;
    int c;             // must be int, not char, so EOF is representable
    while((c = getc(file)) != EOF)
        if(c == '\n')
            ++lines;
    fclose(file);
    return lines;
}

Now open the command line and type gcc lineCount.c. Then type ./a.out *.txt.

This will display the total lines of files ending with .txt in your directory.

Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
Moshe Rabaev
  • 1,892
  • 16
  • 31
2

Here's a flexible one using older Python (works in at least Python 2.6) incorporating Shizzmo's lovely one-liner. Just fill in the types list with the filetypes you want counted in the source folder, and let it fly:

#!/usr/bin/python

import subprocess

rcmd = "( find ./ -name '*.%s' -print0 | xargs -0 cat ) | wc -l"
types = ['c','cpp','h','txt']

sum = 0
for el in types:
    cmd = rcmd % (el)
    p = subprocess.Popen([cmd],stdout=subprocess.PIPE,shell=True)
    out = p.stdout.read().strip()
    print "*.%s: %s" % (el,out)
    sum += int(out)
print "sum: %d" % (sum)
Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
fyngyrz
  • 2,458
  • 2
  • 36
  • 43
2

If you want to count LOC you have written, you may need to exclude some files.

For a Django project, you may want to ignore the migrations and static folders. For a JavaScript project, you may exclude all pictures or all fonts.

find . \( -path '*/migrations' -o -path '*/.git' -o -path '*/.vscode' -o -path '*/fonts' -o -path '*.png' -o -path '*.jpg' -o -path '*/.github' -o -path '*/static' \) -prune -o -type f -exec cat {} + | wc -l

The path patterns used here take two forms:

*/folder_name
*.file_extension

To list the matched files instead of counting them, replace the latter part of the command with -print:

find . \( -path '*/migrations' -o -path '*/.git' -o -path '*/.vscode' -o -path '*/fonts' -o -path '*.png' -o -path '*.jpg' -o -path '*/.github' -o -path '*/static' \) -prune -o -type f -print
Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
Vinayak Bagaria
  • 162
  • 2
  • 6
1

I have BusyBox installed on my Windows system. So here is what I did.

ECHO OFF
for /r %%G in (*.php) do (
busybox grep . "%%G" | busybox wc -l
)
Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
shyam
  • 21
  • 4
1

Yet another command to get the sum of all files (Linux of course)

find ./ -type f -exec wc -l {}  \; | cut -d' ' -f1 | paste -sd+ | bc

Main difference from other answers:

  1. using find -exec,
  2. using paste (with cut),
  3. using bc
AJed
  • 578
  • 6
  • 11
  • I modified this a little in my own answer to work on OS X, but shouldn't this version also have a "-" as the last argument to paste so it takes input from stdin? Or does paste do that by default on Linux? – Kendall Helmstetter Gelner Feb 05 '15 at 09:10
  • It does that by default. Sorry, I didnt see that it is only for OS X. – AJed Feb 08 '15 at 21:04
  • The question was generic so your answer was great for Linux (and also I liked it enough to model my OS X variant on!), just wondered if paste was really a bit different on Linux or if that was a typo to fix. – Kendall Helmstetter Gelner Feb 08 '15 at 22:41
  • Oh now I understand your question! we're piping, so we don't need to use the - option. – AJed Feb 09 '15 at 02:09
  • It was because of the piping I had to use the "-" on OS X, to make it take input from stdin (otherwise it was looking for a file argument). It had been so long since I last used paste on Linux I didn't remember if some versions had the default be stdin... – Kendall Helmstetter Gelner Feb 09 '15 at 05:32
1

On Windows PowerShell try this:

dir -Recurse *.php | Get-Content | Measure-Object -Line
Yunus Eş
  • 159
  • 1
  • 3
0

I may as well add another OS X entry, this one using plain old find with exec (which I prefer over using xargs, as I have seen odd results from very large find result sets with xargs in the past).

Because this is for OS X, I also added in the filtering to either .h or .m files - make sure to copy all the way to the end!

find ./ -type f -name "*.[mh]" -exec wc -l {}  \; | sed -e 's/[ ]*//g' | cut -d"." -f1 | paste -sd+ - | bc
Peter Mortensen
  • 30,738
  • 21
  • 105
  • 131
Kendall Helmstetter Gelner
  • 74,769
  • 26
  • 128
  • 150
0
lines=0 ; for file in *.cpp *.h ; do lines=$(( $lines + $( wc -l $file | cut -d ' ' -f 1 ) )) ; done ; echo $lines
Alex
  • 98
  • 7
0

Here is a way to list the lines for each subdirectory individually:

for dir in */ ; do echo "${dir%/}"; find "$dir" -name '*.php' | xargs wc -l | grep total; done

That outputs something like this:

my_dir1
     305 total
my_dir2
     108 total
my_dir3
    1438 total
my_dir4
   26496 total
flix
  • 1,821
  • 18
  • 23
-2
cat `find . -name "*.php"` | wc -l
Stu Thompson
  • 38,370
  • 19
  • 110
  • 156
-2

If you use Windows, it's easy in two steps:

  1. Install cloc: for example, open cmd as administrator and run => choco install cloc
  2. Then use cd (or open a terminal in the folder with your projects) and run => cloc project-example

Screenshots of the steps:

  1. enter image description here
  2. enter image description here

P.S. You may need to move or remove the build output folder and node_modules first.

Paul Alexeev
  • 172
  • 1
  • 11
-2

You can use this Windows PowerShell one-liner to count any file types you want:

 (gci -include *.cs,*.cshtml -recurse | select-string .).Count
Rostam Bamasi
  • 204
  • 1
  • 6