We have a shared folder containing files that need to be processed. We also have three UNIX servers, each running a shell script that takes and processes one file at a time; at the end of the script the file is moved away. The three servers do not communicate with each other and are not aware of each other.
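Conceptually, each server runs something like the sketch below (the directory names and the process_file command are just placeholders for illustration):

```sh
#!/bin/sh
# Sketch of what each server currently does: pick one file,
# process it, then move it out of the shared folder.
INCOMING=/shared/incoming      # placeholder path
PROCESSED=/shared/processed    # placeholder path

# Grab the first file available, if any.
f=$(ls "$INCOMING" | head -n 1)
[ -n "$f" ] || exit 0

process_file "$INCOMING/$f"    # placeholder for the real processing
mv "$INCOMING/$f" "$PROCESSED/"
```

Nothing here prevents two servers from picking the same file between the listing and the final mv, which is exactly the problem.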
In your opinion, what is the best way to guarantee that each file is processed exactly once, without raising concurrent access issues/errors?