13

I have 30 instances of a process running on a server and want to log open files for each process for analysis.

I ran the following command:

* ps auwwx | grep PROG_NAME | awk '{print $2}' | xargs lsof -p | less

It complains: "lsof: status error on : No such file or directory"

However, if I run `lsof -p <pid>` it gives me the list of open files for that process. How can I get a list of all open files for all 30 instances of the process on a FreeBSD machine?

Moreover, I do not want the shared libraries to be listed. If I use `-d "^txt"`, it also hides some other db files that I want shown. Is there another way to filter out the .so files?

xelco52
user1071840
  • FYI for the Internet passer-by: I wanted to solve a similar problem--counting the number of open files per process, specifically Java. I did it like this: `pgrep java | xargs -I {} bash -c 'printf {}; lsof -p {} | wc -l'` – Nick Chammas Nov 09 '16 at 19:43

2 Answers

18

The lsof -p option takes a comma-separated list of PIDs. The way you're using xargs passes the PIDs as separate arguments, so all but the first are interpreted as filenames.

Try `lsof -p $(your grep | tr '\012' ,)`. That will leave a trailing comma; I'm not sure whether lsof cares, but you can strip it with sed if necessary.
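A minimal sketch of the comma-join approach, assuming the processes can be found with pgrep (the `PROG_NAME` pattern is a placeholder from the question, and the sed step to trim the trailing comma is one possible cleanup):

```shell
#!/bin/sh
# Collect PIDs, join them with commas, and drop the trailing comma
# so lsof -p gets a clean comma-separated list.
pids=$(pgrep PROG_NAME | tr '\n' ',' | sed 's/,$//')
lsof -p "$pids"
```

The same `tr`/`sed` pipeline works with the original `ps | grep | awk` chain; pgrep is just shorter.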

Ben Jackson
5

You can use xargs -L1 lsof -p to run lsof once per pid.

Even better: use lsof -c to list all open files from commands matching a specific pattern:

lsof -c bas # list all processes with commands starting with 'bas'
lsof -c '/ash$/x' # list all commands ending with 'ash' (regexp syntax)
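Combining the per-PID invocation with a negative grep also answers the shared-library part of the question. A sketch, again assuming a placeholder process name `PROG_NAME` (filtering with `grep -v` is an alternative to lsof's `-d` option, which can hide more than intended):

```shell
#!/bin/sh
# Run lsof once per PID, then drop lines mentioning shared objects.
pgrep PROG_NAME | xargs -L1 lsof -p | grep -v '\.so'
```

Unlike `-d "^txt"`, this keeps memory-mapped data files in the output and removes only lines whose paths contain ".so".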
nneonneo