
Suppose I write a wrapper function around `jsonlite::fromJSON` but use a different default value for `simplifyDataFrame`:

read.json <- function(txt, ...) {
  jsonlite::fromJSON(txt, simplifyDataFrame = FALSE, ...)
}

`read.json` is thus a wrapper around `jsonlite::fromJSON` with a different default. However, if the user specifies `simplifyDataFrame = TRUE` to override the default of `read.json`, there is an argument name clash:

> read.json('{"a":1}')
$a
[1] 1

> read.json('{"a":1}', simplifyDataFrame = TRUE)
Error in jsonlite::fromJSON(txt, simplifyDataFrame = FALSE, ...) : 
  formal argument "simplifyDataFrame" matched by multiple actual arguments

What is the best/correct way to write a wrapper function with different parameter defaults that does not lead to a potential name clash?

Kun Ren
  • You could define your `read.json` with the `simplifyDataFrame=FALSE` (and any other parameters you want) in the parameter list. `fromJSON` has five named parameters before the `...`. Just replicate those in your function definition and then include the `...`. – hrbrmstr Oct 18 '14 at 11:14
  • Thanks @hrbrmstr! Is there a more robust way? I mean, if the wrapped function has many parameters and may add more in future, is there a more robust way to do this so that the wrapper function does not have to follow the changes when new parameters are added to wrapped function? – Kun Ren Oct 18 '14 at 11:25
  • Aye. You can use `match.call` to get the parameters passed in from the `...` in your function and either exclude them or change the default you were hardcoding. Of note: you can also use `formals` to see all named parameters from your target function. – hrbrmstr Oct 18 '14 at 11:34
  • It seems that finally I have to use these meta functions to do the work :) – Kun Ren Oct 18 '14 at 11:42
  • I don't quite understand what you mean by "more robust". If the wrapped function has new arguments, they will be taken care of by the `...`. Can you elaborate? – flodel Oct 18 '14 at 11:51
  • yes, no? Also, have I answered your question? – flodel Oct 24 '14 at 10:53

1 Answer


As @hrbrmstr suggested, the simplest approach is:

read.json <- function(txt, simplifyDataFrame = FALSE, ...) {
   jsonlite::fromJSON(txt, simplifyDataFrame = simplifyDataFrame, ...)
}
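With `simplifyDataFrame` promoted to a formal argument of the wrapper, a user-supplied value simply shadows the default instead of clashing. A quick check, reusing the example from the question:

```r
read.json <- function(txt, simplifyDataFrame = FALSE, ...) {
  jsonlite::fromJSON(txt, simplifyDataFrame = simplifyDataFrame, ...)
}

read.json('{"a":1}')                            # uses the wrapper's default, FALSE
read.json('{"a":1}', simplifyDataFrame = TRUE)  # overrides cleanly, no clash error
```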

If you are going to do that with a lot of arguments and want to avoid typing too much, then I'd recommend using the `Curry` functional provided by the functional package:

library(functional)
read.json <- Curry(jsonlite::fromJSON, simplifyDataFrame = FALSE)

The code for Curry is as follows:

function (FUN, ...) {
    .orig = list(...)
    function(...) do.call(FUN, c(.orig, list(...)))
}
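One caveat with this definition of `Curry`: the preset arguments and the caller's arguments are simply concatenated, so a caller passing `simplifyDataFrame = TRUE` would hit the same "matched by multiple actual arguments" error. If you also want callers to be able to override the preset, a dependency-free sketch in the spirit of the `match.call`/`formals` idea from the comments (here using `modifyList`, my choice rather than anything from the original answer) could look like:

```r
# Sketch (my addition): merge hard-coded defaults with the caller's
# arguments, letting the caller's values win, so overrides work.
read.json <- function(txt, ...) {
  defaults <- list(simplifyDataFrame = FALSE)
  args <- modifyList(defaults, list(...))   # caller's values replace defaults
  do.call(jsonlite::fromJSON, c(list(txt), args))
}
```

`modifyList` does the de-duplication that a plain `c(.orig, list(...))` cannot, and any new parameters the wrapped function gains later still flow through the `...` untouched.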

I once recommended `Curry` in https://stackoverflow.com/a/15636912/1201032, and Hadley made the following comment, offering more alternatives:

There's also plyr::partial and in ptools, %<<%, %>>% and %()%. It's not clear how partial evaluation and lazy evaluation of arguments should interact, and each package takes a slightly different approach.

This was before dplyr and magrittr were written; I imagine similar functions have been ported there as well.
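For what it's worth (my note, not part of the original answer): the same idea later landed in the tidyverse as `purrr::partial`, which presets arguments much like `Curry` does:

```r
# Assumption: the purrr package is installed.
library(purrr)

# partial() pre-fills simplifyDataFrame = FALSE, returning a new function.
read.json <- partial(jsonlite::fromJSON, simplifyDataFrame = FALSE)

read.json('{"a":1}')  # parses without data-frame simplification
```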

flodel
  • Curry was also recommended [in this post](http://stackoverflow.com/questions/23856089/how-to-pass-some-but-not-all-further-arguments-with/23857006#23857006). It can be quite useful. – Rich Scriven Oct 18 '14 at 14:37