
I'm working with a system that processes data from different sources and saves the results to a corresponding directory for each source. My task is to find the latest result and follow up with data manipulation in R.

Since we can't anticipate which source the new data will come from, the results land in what is effectively a random location, and I have no control over where they are saved.

How can I find out where a new file has been written? Do I have to maintain a large index of the file system myself, alongside the system?

  • Yes, I need an R answer, so I can move on with R. Thanks for adding it. – Grec001 Aug 19 '19 at 02:19
  • 1
    If you are on Linux you could use inotifywait to launch an R script when the directory of interest changes. If on Windows see https://stackoverflow.com/questions/3517460/is-there-anything-like-inotify-on-windows – G. Grothendieck Aug 19 '19 at 03:06
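
Building on that comment: below is a minimal sketch of driving inotifywait from R on Linux. It assumes the inotify-tools package is installed and uses a placeholder directory ~/test; adjust both to your setup.

# Block until a file is created in the watched directory (Linux only).
# Requires inotifywait from inotify-tools; "~/test" is a placeholder path.
watch_dir <- path.expand("~/test")

# inotifywait exits after the first matching event and, with --format "%f",
# prints just the name of the new file.
new_name <- system2("inotifywait",
                    args = c("-e", "create", "--format", "%f", watch_dir),
                    stdout = TRUE)

new_file <- file.path(watch_dir, new_name)
new_file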

1 Answer


Try this.

  1. Create a test directory and add a text file named a.txt to it.
  2. Run initial = list.files("~/test/", recursive = TRUE) to list all files in test.
  3. Add another file named b.txt to simulate the creation of a new file.
  4. Run current = list.files("~/test/", recursive = TRUE) again to list the files in test.
  5. Then compare current with initial:

current[!current %in% initial]
#[1] "b.txt"
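
If several files may arrive between checks, a polling loop plus file.info can also pick out the most recent result by modification time. This is only a sketch, reusing the ~/test directory and an arbitrary 5-second interval:

# Poll the directory until something new appears, then take the newest
# file by modification time. "~/test" and the interval are placeholders.
watch_dir <- "~/test/"
initial <- list.files(watch_dir, recursive = TRUE)

repeat {
  Sys.sleep(5)  # check every 5 seconds
  current <- list.files(watch_dir, recursive = TRUE)
  new_files <- current[!current %in% initial]
  if (length(new_files) > 0) break
}

# file.info() returns one row per path, with the path as the row name.
info <- file.info(file.path(watch_dir, new_files))
latest <- rownames(info)[which.max(info$mtime)]
latest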
    Isn't this what they were trying to avoid? "Must I keep a large File System myself besides the system?" But I don't know of any other OS-level API to trigger actions on file creation. – Brian Aug 19 '19 at 02:57
  • I am looking for R functions/packages for NTFS or FAT – Grec001 Aug 19 '19 at 04:04