
I tried to fit a model with the nplr package:

require(nplr)

b <- data.frame(
    Conc = c(0.03125, 0.046875, 0.0625, 0.09375, 0.1875),
    BKA  = c(1.89970905356837, 98.2543214102345, 98.0660619544754,
             98.1858634263221, 98.0489474584974)
)

nplr(x = b$Conc, y = b$BKA)

which failed with the error

Testing pars...
The 3-parameters model showed better performance
Error in nplr(x = b$Conc, y = b$BKA) : 
nplr failed and returned constant fitted values.
        Your data may not be appropriate for such model. 
In addition: Warning message:
In nlm(f = .sce, p = .initPars(x, y, 5), x = x, yobs 
= y, .weight,  :
  NA/Inf replaced by maximum positive value

Does anyone have any ideas for why it's giving this error?

Laura
1 Answer

Because nplr expects the response values to be between 0 and 1, you can rescale the y values by dividing by 100, which makes the error go away, e.g.:

require(nplr)
require(dplyr)

b <- data.frame(
    Conc = c(0.03125, 0.046875, 0.0625, 0.09375, 0.1875),
    BKA  = c(1.89970905356837, 98.2543214102345, 98.0660619544754,
             98.1858634263221, 98.0489474584974)
)

b <- 
    b %>% 
    mutate(BKA = BKA/100)

nplr(x = b$Conc, y = b$BKA)
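If you would rather not pull in dplyr just for this, the rescaling itself is a one-liner in base R. A minimal sketch, using the same example data as in the question:

```r
# Same example data as in the question
b <- data.frame(
    Conc = c(0.03125, 0.046875, 0.0625, 0.09375, 0.1875),
    BKA  = c(1.89970905356837, 98.2543214102345, 98.0660619544754,
             98.1858634263221, 98.0489474584974)
)

# Rescale the response from percentages to proportions in [0, 1]
b$BKA <- b$BKA / 100

# All values now lie in [0, 1], as nplr expects
range(b$BKA)
```

After the rescaling, call nplr(x = b$Conc, y = b$BKA) exactly as before.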
Matt
  • I feel this answer is somewhat incomplete: looking at the source code (I have not tried this), `nplr`'s 4- and 5-parameters models should be well capable of fitting data with arbitrary `(top, bottom)` values, so these should return a much better fit than the 3-parameters model. So I suspect that this is as much an issue of convergence (`nplr` using poor initialization) as of an incorrect model, at least for the 4- and 5-parameter options. – bers Feb 21 '20 at 07:16
  • (I have validated part of my previous comment by showing that `nplr` can well fit `data.frame("Conc" = c(1, 2, 3, 4, 5, 6, 7), "BKA" = c(0, 10, 20, 50, 80, 90, 100))` with a weighted GoF of 0.9999683. It does use the 5-parameters model for the result.) – bers Feb 21 '20 at 07:16