
Suppose I write a function like this in R:

fun1 <- function(x, ..., simplify = TRUE, sort = TRUE) {
  # do something here ...
}

In this function, `...` is meant to take a number of expressions that are evaluated in specific environments. However, one of those expressions may itself be `simplify = FALSE` or `sort = FALSE`, intended as input for `...` rather than as an argument of fun1.
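
For example (a minimal illustration, with a throwaway body that only reports what it received), a named argument that matches one of fun1's formals is bound to that formal and never reaches `...`:

fun1 <- function(x, ..., simplify = TRUE, sort = TRUE) {
  # report where the arguments ended up
  cat("simplify =", simplify, "\n")
  cat("arguments captured by ...:", length(list(...)), "\n")
}

fun1(1, simplify = FALSE)
# simplify = FALSE
# arguments captured by ...: 0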

I have learned from some packages that the authors avoid potential conflicts between named values meant for `...` and the existing argument names. They write the function in the following manner:

fun1 <- function(.data, ..., .simplify = TRUE, .sort = TRUE) {
  # do something here ...
}

This does not solve the problem, but it avoids many potential conflicts, under the assumption that typical input expressions will rarely use the names .data, .simplify, or .sort.
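
As a rough check of why the dot prefix helps (relying only on R's standard matching rules: arguments that follow `...` in the formals are matched by exact name only), the same call now falls through to `...`:

fun1 <- function(.data, ..., .simplify = TRUE, .sort = TRUE) {
  cat(".simplify =", .simplify, "\n")
  cat("arguments captured by ...:", length(list(...)), "\n")
}

fun1(1, simplify = FALSE)
# .simplify = TRUE
# arguments captured by ...: 1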

What is the best practice to solve or work around this problem?

Kun Ren
    maybe something like the MoreArgs argument in `mapply`? take a look at the source and see if that is helpful – rawr Jul 06 '14 at 00:07
  • Are you trying to "hard-code" certain `...` arguments? If so, the answers and comments in [this post](http://stackoverflow.com/questions/23856089/how-to-pass-some-but-not-all-further-arguments-with) might be useful. You might also try placing the `...` after the existing arguments. – Rich Scriven Jul 06 '14 at 00:15
  • If you have arguments that match multiple functions in your "fun1", another way could be -instead of `...`- to use arguments whose input is a list of named arguments and call each function in your "fun1" using `do.call`. E.g. `fun1 = function(f2_args, f3_args) do.call(f2, f2_args); do.call(f3, c(f3_args, extra_f3_arg))`. – alexis_laz Jul 06 '14 at 00:16
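
A minimal sketch of the list-of-arguments idea from the last comment, using hypothetical downstream functions f2 and f3 (the names and the extra argument are placeholders, not part of the original question):

f2 <- function(simplify = TRUE) simplify
f3 <- function(sort = TRUE, extra = NULL) list(sort = sort, extra = extra)

fun1 <- function(f2_args = list(), f3_args = list()) {
  out2 <- do.call(f2, f2_args)                       # forward f2's arguments as a named list
  out3 <- do.call(f3, c(f3_args, list(extra = 1)))   # append an extra argument for f3
  list(out2 = out2, out3 = out3)
}

fun1(f2_args = list(simplify = FALSE), f3_args = list(sort = FALSE))

Because each downstream function's arguments travel in their own named list, names such as simplify or sort can never collide with fun1's own formals.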

0 Answers