
Run out of the box, Ctrl-P works 100% as I desire; the only problem is that, with the size of the codebase I work with, it takes a very long time to index directories.

To make Ctrl-P cope with the project sizes I'm dealing with, I'm using the following (fairly popular) user_command setting in my .vimrc, which hands the file-listing job off to fast native helper utilities:

if has("unix")
    let g:ctrlp_user_command = {
        \ 'types': {
            \ 1: ['.git/', 'cd %s && git ls-files']
        \ },
        \ 'fallback': 'find %s -type f | head -' . g:ctrlp_max_files
        \ }
endif

This approach makes indexing blazingly fast, but configured this way Ctrl-P no longer learns about the contents of git submodules, as it did when running without a helper program: `git ls-files` doesn't recurse into submodules, while Ctrl-P's naive directory traversal does.

I've tried using `find` to index git repositories as well, but then I wind up indexing .git directories, object files, and all sorts of other things that Ctrl-P normally knows to ignore automatically; it seems that providing a user_command completely supersedes the built-in logic about which files to ignore. I could probably hack together an inverse grep to filter those entries out, but it seemed like someone must have figured out a more elegant solution.
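For the record, the "inverse grep" I alluded to can also be done inside find itself. A minimal sketch (the pruned patterns here, `.git` and `*.o`, are just examples; extend the list to taste):

```shell
# Prune .git directories and object files while listing everything else.
# The trailing "-o -type f -print" is what keeps pruned entries out of
# the output; -prune alone would still print the pruned names.
find . \( -name '.git' -o -name '*.o' \) -prune -o -type f -print
```

Plugged into the user_command fallback, this keeps the speed of find without indexing repository metadata.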

Is there another, perhaps cleverer way to get Ctrl-P to index all files within a git repository, including the files inside all its submodules, other than resorting to its slow built-in search?

glts
Trevor Powell
  • [It looks](http://stackoverflow.com/questions/3115249/how-to-make-top-level-git-to-track-all-the-files-under-another-sub-directory-git) like Git doesn't track files in submodules. You must actively add them for `git ls-files` to show them. – romainl Feb 06 '13 at 08:39

4 Answers


You could use find with options to filter out the files you don't want, like:

find . -type f \( -name "*.cpp" -or -name "*.h" \)

or to remove .git directories:

find . -path '.git' -prune -o -print

but this is not a smart solution.

Xavier T.
  • The `-path '.git'` command doesn't work for me, because `find .` provides paths with "./" on the front of them. Either `find * -path '.git' -prune -o -print` (to exclude a '.git' directory at the top level), or `find . -name '.git' -prune -o -print` (to exclude '.git' files and directories anywhere within the search) will work. As you say, these aren't really *smart* solutions, but they'll do in a pinch, and the latter is what I'm actually using at the moment. – Trevor Powell Feb 06 '13 at 23:10

You could use ack, which skips VCS dirs like .git, .svn, .hg, etc.

I use ack a lot at work; it's great for global find-and-replace. I once used find, piping the results into xargs with a Perl one-liner regex; unfortunately, that rewrote files inside my .git dir as well, leaving me unable to commit my changes and unable to checkout, reset, or revert. I had to check out the codebase anew after that! Ack is significantly easier to use than find when trying to exclude directories: just use the --ignore-dir flag (or --noignore-dir to search in directories it ignores by default).
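In the same spirit, a tool like this can be hooked directly into Ctrl-P as its lister. A sketch using ag, assuming it is installed (this is the commonly circulated snippet, not something from the CtrlP docs):

```vim
" Use ag as Ctrl-P's file lister: it honours .gitignore and skips VCS dirs.
if executable('ag')
  " -l lists matching file names only; -g "" matches every file name.
  let g:ctrlp_user_command = 'ag %s -l --nocolor -g ""'
endif
```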

Nick Desaulniers
  • I don't think this is really relevant for listing files, but I agree that ack is a fantastic replacement for grep for programmers. I've recently switched from ack to [the silver searcher](https://github.com/ggreer/the_silver_searcher) ("ag"), which is even faster still, and is probably about as fast as a grep tool can get without pre-building an index the way that ctags does. Worth looking at, if you haven't already done so. – Trevor Powell Feb 23 '13 at 11:32
  • It is actually quite relevant. You can hook ag into CtrlP pretty easily. No useless files clogging up your matches, with ridiculous speeds. Here's my config for it: https://github.com/Wolfy87/vim-config/blob/2d9ede6096fa07144f5ba71ac9dc8b5ed18067e3/bundles.vim#L113-L114 – Olical May 21 '13 at 20:02

I started using a variation on one of the other answers to this question, but had performance problems due to all the calls to echo. My index comes to about 65k files, and with the syntax `git ls-files -co --exclude-standard | while read i; do echo \"\$path/\$i\"; done` it took nearly 3 seconds to generate. Converting the loop to sed saved all of that time:

let g:ctrlp_user_command = {
  \ 'types': {
    \ 1: ['.git', 'cd %s && git ls-files -co --exclude-standard && git submodule foreach "git ls-files -co --exclude-standard | sed \"s#^#\$path/#\""'],
    \ 2: ['.hg', 'hg --cwd %s locate -I .'],
    \ },
  \ 'fallback': 'find %s -type f'
  \ }
Jason S

I use `git submodule foreach` to descend into submodules and run ls-files there, prefixing each file name with the submodule path:

let g:ctrlp_user_command = ['.git', 'cd %s && git ls-files -co --exclude-standard && git submodule foreach "git ls-files -co --exclude-standard | while read i; do echo \"\$path/\$i\"; done"']
remram