I found a workaround to cut down the search time compared to `find`. Instead of `find`, `locate` can be used:

```sh
locate -r '/home/'"$USER"'/.*\.git$'
```

`-r` takes a regular expression as input. The pattern here matches all `.git` directories under `/home/$USER`, i.e. all git repositories. This is a bit faster than using `find`.
Catch using `locate`

`locate` searches a local database rather than the filesystem, so it only works as expected once that database has been built or updated. To update the database, run `sudo updatedb`. Whenever you add, move, or delete a file (or, in this case, a directory), remember to update the database so that `locate` gives correct results.
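A quick illustration of the catch (the repository path `~/projects/demo` is a made-up example):

```sh
mkdir -p ~/projects/demo/.git          # a brand-new (empty) repository
locate -r '/home/'"$USER"'/.*\.git$'   # the database is stale, so this may miss it
sudo updatedb                          # rebuild the locate database
locate -r '/home/'"$USER"'/.*\.git$'   # now the new repository shows up
```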
Tip

To avoid entering a password every time for `updatedb` (and other frequently used commands), add them to sudoers by executing `sudo visudo` and adding an entry with the path to the command's binary.
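A minimal sketch of such an entry (the username `alice` is a placeholder; verify the actual binary path with `command -v updatedb`):

```
# added via: sudo visudo
# allow user "alice" to run updatedb without a password prompt
alice ALL=(root) NOPASSWD: /usr/bin/updatedb
```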
Update

I recently realized: why use `locate` when I can simply maintain my own database and pipe all the entries to `dmenu`? With this I was able to achieve what I needed.
```sh
# Make a temp directory
mkdir -p "$HOME/.tmp"
# Search for all .git directories and store them in ~/.tmp/gitfiles.
[ -e "$HOME/.tmp/gitfiles" ] || find "$HOME"/ -regex '.*/\.git$' -type d 2>/dev/null > "$HOME/.tmp/gitfiles"
# cat this file into dmenu
cat "$HOME/.tmp/gitfiles" | dmenu
```
This gives fuzzy finding over the directories with `dmenu`. It is better than using `locate`: a local database has to be kept up to date either way, but `locate` filters for git directories at runtime against the regex, whereas here the filtering was already done when the file was generated, so this approach is a bit faster.
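To actually act on the selection, the chosen path can be wired to a command. A minimal sketch, assuming the use case is opening a shell in the chosen repository (`st` is just one example of a terminal emulator):

```sh
#!/bin/sh
# Pick a .git directory from the cached list, then open its repository.
repo=$(dmenu < "$HOME/.tmp/gitfiles") || exit 1   # exit if the menu is cancelled
cd "$(dirname "$repo")" && st                     # swap "st" for your terminal/editor
```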
I can simply create an alias to update this database, analogous to `sudo updatedb` in the case of `locate`:

```sh
alias gitdbupdate='find "$HOME"/ -regex ".*/\.git$" -type d 2>/dev/null > "$HOME/.tmp/gitfiles"'
```
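Since an alias only helps in an interactive shell, the refresh can also be scheduled. A daily cron entry is one option (my assumption, mirroring how distributions typically schedule `updatedb`); note that cron does not expand aliases, so the full command is spelled out:

```sh
# crontab -e  (rebuild the list every day at 09:00)
0 9 * * * find "$HOME"/ -regex '.*/\.git$' -type d 2>/dev/null > "$HOME/.tmp/gitfiles"
```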
Note that I am not using `/tmp/`, since it is not persistent across power cycles; instead I create my own `$HOME/.tmp/` directory.