
I'm trying to use model-based recursive partitioning (MOB) via the mob() function from the partykit package to separate several curves that I fit with nls(). I had to define my own model and determine the starting values, and so far I haven't been able to get this to work inside mob().

I tried following the example on page 7 of the vignette: https://cran.r-project.org/web/packages/partykit/vignettes/mob.pdf. I wrote a fit function that estimates the starting values and returns the estimates etc. from nls(), but I can't get anything to work after that. Is it possible to plug a custom model, with its own coefficients and both dependent and independent variables, into mob() and get it to run? I also tried lmtree(), but of course that only fits a straight line.

My code is below. Basically, I use a segmented linear regression to get the starting values for the double exponential curve I am fitting. This is about as far as I got: the parameter estimates throw an error, and even if you get past that, it just won't run. I just need to know whether it is at all possible for mob() to run nls().

I've included my sample data below, in case someone can show whether nls() can be made to work with mob().

    photo.try <- function(y, x, start = NULL, weights = NULL, offset = NULL,
        estfun = FALSE, object = TRUE)
    {
        lin.mod1 <- lm(y ~ x)
        segmented.mod.2 <- segmented(lin.mod1, seg.Z = ~ x, psi = 1)
        segmented.mod1 <- segmented(lin.mod1, seg.Z = ~ x, psi = segmented.mod.2$psi[1, 2])
        nls(y ~ (a * exp(-b * x) - c * exp(-d * x)),
            start = list(a = -1 * (intercept(segmented.mod1)[[1]][1, 1]),
                         b = slope(segmented.mod1)[[1]][1, 1],
                         c = -1 * (intercept(segmented.mod1)[[1]][2, 1]),
                         d = -1 * slope(segmented.mod1)[[1]][2, 1]))
    }

    photo_form <- Pn ~ (a * exp(-b * PAR) - c * exp(-d * PAR)) | Species

    photo_tree <- mob(photo_form, data = eco, fit = photo.try)

Here is my sample data:

eco <- structure(list(Species = structure(c(1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 
1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 1L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 2L, 
2L, 2L, 2L, 2L, 2L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 
3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 
3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 
3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 
3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 
3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 
3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 3L, 4L, 4L, 4L, 4L, 4L, 4L, 
4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 
4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 
4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 
4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 
4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 4L, 
4L, 4L, 4L, 4L, 4L, 4L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 
5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 
5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 5L, 
5L, 5L, 5L, 5L, 5L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 
6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 
6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 
6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 
6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 
6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 
6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 6L, 
6L, 6L, 6L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 
7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 
7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 
7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 
7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 
7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 
7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L, 7L), .Label = c("Bogum", 
"Clethra", "Eugene", "Guarria", "Melo", "Santa", "Sapium"), class = "factor"), 
    PAR = c(0, 58.6, 101.4, 228.6, 462.4, 904.7, 1565.8, 1992.1, 
    2395.9, 0, 72.8, 125.9, 232.8, 411, 841.1, 1669.6, 2394.5, 
    2394.9, 0, 53.5, 122.1, 231.6, 451, 808.5, 1575, 2394.6, 
    2395.1, 0, 70.9, 104.8, 251.1, 474.6, 858.3, 1612.3, 2393.3, 
    2395.1, 0, 63.1, 124.6, 277.1, 417.7, 824.4, 1649.6, 2377.7, 
    2381.9, 0, 31, 46.5, 115.7, 228.1, 424.3, 822.5, 1644.2, 
    2380.7, 2381.2, 0, 50.1, 118.1, 203.3, 413.2, 804.5, 1587.3, 
    2385.3, 0, 28.8, 36.9, 101.2, 211.7, 423.1, 793, 0, 43.6, 
    106.7, 200.8, 468.6, 808.4, 1567, 2367.1, 2376.5, 0.1, 40.4, 
    104.1, 202.2, 447.3, 794.7, 1546, 2391.8, 2393.3, 0.1, 44.1, 
    107.5, 227.4, 429.6, 802.5, 1668.4, 2391, 0, 42.2, 125.3, 
    126.2, 127.3, 240.3, 433.4, 791, 1600, 2396.8, 2397, 2399.3, 
    0, 72.7, 118.1, 236.9, 425, 828.4, 1613.3, 1615.4, 2396.1, 
    2396.5, 2397.2, 2397.5, 0, 62, 116.2, 235.5, 401.7, 879, 
    879.8, 1552.2, 1553.9, 2394.3, 2394.4, 2394.7, 2396.6, 0, 
    84.8, 135, 209.8, 425.3, 859.1, 1597.6, 2377.3, 2379.5, 2385.1, 
    0.1, 62, 106.3, 226.2, 442.9, 822.5, 1462.3, 2389.8, 2392.1, 
    0.1, 0.1, 73.9, 126, 249.8, 428.5, 846.5, 1555.3, 2390.1, 
    2390.7, 2390.8, 0, 68.7, 121.5, 209.7, 426.2, 803, 1525.9, 
    2389.8, 0, 52.8, 96.9, 211.1, 441.3, 787.9, 1566.5, 2415.2, 
    2415.3, 2415.5, 2417.5, 2417.7, 2418.5, 0.1, 46.5, 108.4, 
    233.5, 461.7, 792.3, 1635.7, 2415.1, 2415.6, 2415.6, 2416.5, 
    2416.6, 2417.8, 0.1, 68.3, 110, 239.5, 531.7, 847.2, 1591.4, 
    2387.3, 2387.6, 2389.7, 0, 49.7, 114.6, 230.6, 397.7, 398.2, 
    817.7, 1596.4, 2376.2, 2376.4, 2380.9, 0, 62.9, 65.5, 117, 
    209, 431.2, 854.5, 1611.3, 2387.3, 2388.5, 2390.3, 0, 49.1, 
    108.9, 200.3, 408.8, 842.2, 1630.2, 2386.5, 2386.8, 2388.2, 
    0, 64.8, 122.9, 226, 422.9, 801.6, 1635.7, 2383.6, 2383.6, 
    2384.3, 2386.1, 0, 36.7, 143.2, 213.7, 444.9, 814.9, 816.2, 
    1496.5, 2384.7, 2386.5, 2388.6, 0.1, 45.6, 105.2, 206.7, 
    494.8, 901.2, 1610.9, 2388, 2388.1, 2388.3, 2388.6, 0, 0.1, 
    45.9, 48.5, 100.2, 209.4, 432.4, 778, 1600.3, 2408.8, 2408.8, 
    0, 71.8, 121.6, 216.4, 404.3, 815.2, 1622, 2414.9, 2415.1, 
    2416.1, 2416.1, 0, 36.2, 97.5, 186.7, 417.9, 840.4, 1597.5, 
    2390.7, 2390.9, 2391.2, 2391.2, 2391.5, 2392.1, 2392.5, 0, 
    53.8, 138.2, 227, 403.6, 800.8, 1642.3, 2396.9, 2397.1, 0, 
    57.9, 95.1, 246.6, 466.8, 796.2, 1574.2, 2395.5, 2397.3, 
    0, 54.9, 94.9, 201.7, 408.1, 822.6, 1596, 2384.1, 0, 55.6, 
    131, 202.5, 419.8, 798.5, 1614, 2387.4, 2387.8, 0, 39.1, 
    109.6, 197.1, 403.3, 835.4, 836.9, 1725.9, 1727.4, 1729.3, 
    1730.6, 54.5, 58.6, 125.4, 226.9, 409, 806.8, 1578.8, 2377.2, 
    2380.1, 2388.3, 0, 68, 127.4, 206.9, 510.5, 814.9, 1561, 
    2404.1, 2404.8, 0, 58.4, 95.3, 229.6, 457.2, 781.5, 1634.4, 
    2399.8, 2401, 2403, 0.1, 56.5, 101.9, 221.8, 394.3, 815.1, 
    1655.4, 2411.8, 2411.9, 0, 50.2, 107.3, 220.5, 434.4, 819.8, 
    1630.6, 2412.4, 2412.6, 0, 48.4, 117.7, 195.3, 403.2, 801, 
    1632.7, 2388.9, 2389.3, 2390.7, 0, 50.4, 120.3, 234.7, 460.3, 
    829.1, 1581.7, 2398.5, 2402.3, 0, 60.8, 105.8, 215.8, 466.6, 
    826, 828.3, 1570.8, 2405.6, 2406.1, 2408.8, 0, 52.6, 106.9, 
    206.5, 414.3, 868.4, 1629.9, 1655.1, 2409.1, 2413, 0, 49.5, 
    100.6, 232.9, 389.4, 808.2, 1588.2, 2412.4, 2413.3, 2415.9, 
    0.1, 70.9, 110.5, 208.4, 409, 807.5, 1579.9, 2382.2, 2382.5, 
    2383.6, 2383.8, 0, 61.5, 106.5, 213.9, 473.8, 814.2, 1561.9, 
    2390.7, 2391.9, 2393.1, 0, 59.9, 64, 112, 216, 397.6, 807.4, 
    1625, 2392.3, 2395.1, 0, 74, 108.8, 109.7, 236.1, 433.6, 
    794.7, 1590.3, 2381.9, 2382.5, 0.1, 56.3, 114.5, 254.1, 487.7, 
    864.3, 1593.5, 2369.3, 2369.3, 2372.3, 2373.9, 0.2, 57.1, 
    110, 201.4, 402.7, 807.2, 1572.9, 2392.8, 2393.5, 0.1, 56.4, 
    122.5, 224.5, 420.2, 853.7, 1502.1, 2390.3, 2392.9, 0, 50.5, 
    53.7, 118.2, 230, 462.8, 794.3, 1513.4, 2391.4, 2392.3, 2393.4, 
    2393.4, 2394.1, 0.1, 49.7, 98.3, 208.3, 383.2, 850.7, 1653.5, 
    2395.3, 2396, 2397.1, 0, 48.4, 121.2, 228.8, 423.9, 817, 
    1708.5, 2389.9, 2389.9, 0, 66.4, 129.7, 209.4, 431.5, 794.1, 
    1673.7, 2383.7, 2384.2, 0, 57, 122.6, 215, 434.1, 838.5, 
    1657.5, 2386.4, 0.1, 22.6, 127.8, 220.4, 404.3, 810.9, 1592.3, 
    2386.7, 2388.7, 0, 49.8, 119.7, 200.5, 463.8, 828.7, 1560.7, 
    2384.5, 2385.7, 2391.2, 0, 73.1, 138.2, 226.6, 408.5, 815.3, 
    1627.3, 2390.2, 2395.4, 0, 61.2, 108.8, 233.8, 417.7, 824.5, 
    1502.7, 2395, 2396.2, 0, 56, 101.4, 226.3, 282.1, 412.9, 
    873.8, 1672.6, 2380.4, 2380.9, 2381.5, 0.1, 70.7, 138, 246, 
    444.4, 817.1, 1643.2, 2391.5, 2391.8, 2392), Pn = c(-0.95, 
    0.75, 0.94, 1.27, 1.5, 1.9, 2.14, 2.35, 2.38, 1.48, 3.51, 
    3.7, 3.99, 4.4, 4.32, 4.52, 4.73, 4.72, 1.97, 3.24, 4.23, 
    4.35, 4.41, 4.66, 4.57, 4.68, 4.88, 1.16, 3.64, 4.05, 4.75, 
    5.42, 5.57, 5.55, 5.89, 5.8, 1.48, 3.89, 4.7, 5.34, 5.47, 
    5.62, 5.71, 5.7, 6.08, 1.26, 0.59, 2.96, 4.34, 5, 4.82, 5.22, 
    5.2, 5.33, 5.51, 1.2, 2.95, 3.67, 3.9, 4.06, 4.59, 4.6, 4.62, 
    2.01, 1.92, 2.41, 2.19, 2.22, 2.41, 2.21, 1.6, 3.29, 3.97, 
    4.39, 4.89, 5.12, 4.93, 5.12, 5.1, 2.39, 3.84, 4.45, 4.63, 
    4.43, 4.93, 4.78, 4.73, 5.04, 3.09, 3.74, 4.03, 3.89, 4.52, 
    4.43, 4.24, 4.26, 1.5, 2.73, 2.83, 3.14, 2.89, 3.39, 2.89, 
    2.84, 3.34, 3.11, 3.16, 3.31, 0.1, 1.17, 1.72, 1.61, 1.64, 
    2.06, 2.17, 1.99, 2.31, 2.14, 2.27, 2.08, 0.17, 1.17, 1.32, 
    1.33, 1.4, 1.8, 1.48, 2, 1.81, 1.95, 2.09, 1.73, 1.85, 2.95, 
    4.33, 4.82, 4.98, 4.97, 5.03, 5.08, 5.22, 5.32, 4.88, 2.17, 
    3.08, 3.32, 3.42, 3.45, 3.67, 3.64, 3.71, 3.71, 2.85, 2.33, 
    3.15, 2.81, 3.22, 2.99, 3.16, 3.33, 3.56, 3.61, 3.63, 2.52, 
    3.55, 4.07, 4.1, 4.17, 4.41, 4.53, 4.56, 2.06, 2.57, 2.91, 
    2.61, 3.08, 3.29, 3.99, 6.49, 5.23, 6.08, 5.74, 4.41, 6.5, 
    1.59, 3.22, 3.59, 3.75, 3.84, 4.5, 4.93, 6.87, 6.75, 6.97, 
    6.53, 6.04, 6.82, 1.28, 3.56, 4.39, 5.27, 5.51, 6.38, 7.05, 
    7.46, 7.16, 7.24, 0.87, 2.45, 3.86, 4.32, 4.57, 4.43, 4.68, 
    4.71, 4.86, 4.36, 4.68, 1.06, 2.79, 4.05, 4.86, 5.48, 5.9, 
    6.38, 6.79, 7.46, 7.12, 7.03, 2.76, 3.92, 3.96, 4.07, 4.2, 
    4.5, 4.91, 5.52, 5.49, 5.33, 2.84, 4.78, 4.83, 4.76, 4.74, 
    4.84, 5.19, 5.59, 5.74, 5.7, 5.65, 3.02, 3.61, 4.14, 4.23, 
    4.45, 4.37, 4.5, 4.6, 4.78, 4.79, 4.85, 2.71, 4.26, 5.42, 
    6.24, 6.58, 6.63, 6.55, 7.29, 7.43, 7.24, 7, 3.36, 2.19, 
    2.86, 2.87, 2.37, 3.16, 2.68, 3, 3.4, 3.6, 4.35, 1.28, 2.62, 
    2.92, 3.3, 3.35, 3.58, 3.73, 4.02, 4, 3.7, 3.75, 1.61, 2.26, 
    2.5, 2.52, 2.71, 2.61, 2.75, 3.19, 2.92, 3.99, 4.36, 3.67, 
    4.14, 4.37, -0.28, 1.91, 2.78, 2.84, 2.96, 3.04, 3.24, 3.44, 
    3.58, 1.78, 4.12, 4.58, 4.33, 4.8, 4.7, 5.02, 5.09, 5.22, 
    2.79, 4.71, 4.89, 4.93, 4.87, 4.92, 4.83, 4.81, 1.66, 3, 
    4.04, 4.35, 4.56, 4.75, 4.75, 4.66, 4.89, 1.56, 2.77, 3.86, 
    3.58, 3.7, 3.76, 3.58, 4.55, 4.63, 4.05, 3.73, 1.76, 2.71, 
    2.98, 3.01, 3.06, 3.22, 2.99, 3.15, 3.32, 3.34, 1.58, 3.76, 
    4.97, 5.21, 5.29, 5.5, 5.59, 5.71, 5.74, 1.89, 2.67, 3.01, 
    3.14, 3.39, 3.57, 3.45, 3.91, 4.11, 3.94, 1.15, 2.88, 3.63, 
    4.32, 4.09, 4.43, 4.58, 4.61, 4.63, 1.23, 2.26, 3.15, 3.33, 
    3.3, 3.61, 3.46, 3.65, 3.67, 0.19, 2.23, 3.43, 4.1, 4.85, 
    5.21, 5.8, 6.27, 6.34, 6.08, 1.94, 3.72, 4.88, 5.51, 6.71, 
    6.51, 6.96, 7.01, 7.4, 0.48, 2.29, 2.5, 2.87, 3.18, 3.51, 
    3.13, 3.86, 4.13, 4.34, 4.03, 1.63, 3.64, 5.15, 5.95, 6.43, 
    6.57, 6.61, 6.51, 6.65, 6.56, 1.93, 3.95, 4.63, 5.66, 6.03, 
    6.28, 6.67, 6.69, 6.95, 6.75, 0.93, 3.14, 3.46, 3.9, 4.19, 
    4.27, 4.77, 5.39, 5.36, 5.24, 5.02, 1.71, 3.31, 3.86, 4.02, 
    4.02, 4.29, 4.36, 4.73, 4.88, 4.59, 1.63, 2.65, 2.63, 2.48, 
    2.93, 3.45, 4.01, 4.67, 5.02, 5.08, 1.93, 3.54, 3.8, 3.81, 
    4.04, 4.17, 4.38, 4.55, 4.99, 4.99, 1.29, 2.73, 3.32, 3.66, 
    3.77, 3.79, 4.14, 4.37, 4.22, 4.1, 4.14, 1.06, 2.89, 3.65, 
    4.01, 4.11, 4.19, 4.66, 5.03, 5.12, 0.97, 2.45, 2.99, 3.32, 
    3.34, 3.35, 3.47, 3.12, 3.38, 2.29, 1.72, 4.33, 5.49, 6.44, 
    6.96, 7.91, 7.49, 8.45, 8.21, 8.17, 8.71, 8.35, 0.29, 2.99, 
    3.93, 4.52, 5.69, 6.23, 6.23, 6.81, 6.96, 6.68, 0.99, 3.67, 
    4.62, 5.52, 5.86, 6.23, 5.91, 6.64, 6.29, -0.08, 3.34, 4.89, 
    6.02, 6.37, 6.59, 6.99, 6.95, 7.2, 0.99, 2.28, 2.72, 2.67, 
    2.99, 3.18, 3.55, 3.58, 1.31, 2.18, 5.55, 7.37, 8.42, 9.14, 
    9.44, 9.26, 9.5, 1.23, 3.11, 5.01, 6.21, 7.14, 7.44, 7.79, 
    7.73, 8.1, 7.96, 1.35, 3.33, 5.67, 6.58, 7.05, 7.36, 7.73, 
    7.75, 7.99, 0.4, 2.25, 2.83, 3.31, 3.55, 3.66, 3.96, 3.54, 
    3.77, 1.46, 2.91, 3.51, 3.64, 4.5, 3.83, 3.96, 4.17, 4.66, 
    4.09, 4.44, 2.41, 4.77, 5.49, 6.05, 6.15, 6.28, 6.6, 6.76, 
    6.75, 6.78)), .Names = c("Species", "PAR", "Pn"), class = "data.frame", row.names = c(NA, 
-628L))
  • Add sample data by copying the output of `dput` into a new code section (created by starting with four spaces). – lmo May 04 '16 at 20:43
  • I did something..not sure if it was the right thing... – kurtis May 04 '16 at 22:45
  • That's it. The output of `dput` is not the prettiest. – lmo May 05 '16 at 00:14
  • @kurtis This post is from a few years ago, but if you're still working on similar problems, this package may be of interest: https://github.com/marjoleinF/gamtree It allows for the partitioning you want to do, only difference: it uses GAMs as fitted with package mgcv, instead of nls / nlme. I've taken the liberty of using an adjusted version of the dataset you supplied above as an example, see the url above. If you're still working on problems like these, the package may be interesting, and I would be glad to hear about use cases, ideas or suggestions. – Marjolein Fokkema Jun 25 '19 at 13:48

1 Answer


Yes, we can! ;-)

In principle, you were attempting to do the right thing but a few aspects were not quite correct. The main issue is how you pass around the data and the formula: As mob() does not know anything about the way nls() specifies its formulas, a plain formula Pn ~ PAR | Species needs to be used and then the fit function needs to know what to do with the data. The pre-processing offered by mob() can either set up a model matrix (with intercept, dummy/contrast codings, etc.) or a model frame (where factors are still factors etc.). In this case it is easiest to use the default model matrix and then to omit the intercept in the fitting function.
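For illustration (a small sketch, not part of the original code here): with the default pre-processing, the x handed to the fit function is the model matrix built from the regressor part of the plain formula, i.e. an "(Intercept)" column plus PAR. You can inspect roughly what mob() builds internally like this, which is also why the fit function below drops the first column via x[, 2]:

## model matrix corresponding to the regressor part of Pn ~ PAR | Species
head(model.matrix(Pn ~ PAR, data = eco))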

The second problem with your code was that you used the extended specification of the fit function (with estfun and object arguments) but only supplied the fitted model object. With that specification mob() expects that the fit function sets up a suitable list with coefficients and objfun etc.

In combination, this means that your fit function should look like this:

photofit <- function(y, x = NULL, start = NULL, weights = NULL, offset = NULL, ...,
  estfun = FALSE, object = FALSE)
{
  ## only use first real regressor (without intercept)
  x <- x[, 2]

  ## obtain starting values if necessary
  if(is.null(start)) {
    aux_lm <- lm(y ~ x)
    aux_seg_2 <- segmented::segmented(aux_lm, seg.Z = ~ x, psi = 1)
    aux_seg_1 <- segmented::segmented(aux_lm, seg.Z = ~ x, psi = aux_seg_2$psi[1, 2])
    start <- list(
      a = -1 * (segmented::intercept(aux_seg_1)[[1]][1, 1]),
      b =           segmented::slope(aux_seg_1)[[1]][1, 1], 
      c = -1 * (segmented::intercept(aux_seg_1)[[1]][2, 1]), 
      d = -1 *      segmented::slope(aux_seg_1)[[1]][2, 1]
    )
  } else {
    start <- as.list(start)
  }

  ## estimate NLS model
  rval <- nls(y ~ (a * exp(-b * x) - c * exp(-d * x)), start = start)

  ## return processed information for mob()
  list(
    coefficients = coef(rval),
    objfun = deviance(rval),
    estfun = if(estfun) sandwich::estfun(rval) else NULL,
    object = if(object) rval else NULL    
  )
}

And then you can grow the MOB tree. Specifying the verbose = TRUE control option will give you a little bit of progress information while you wait:

photomob <- mob(Pn ~ PAR | Species, data = eco, fit = photofit,
  control = mob_control(verbose = TRUE))
coef(photomob)
##           a             b         c             d
## 4  2.967680 -3.216708e-05  1.519680  1.076879e+01
## 5 -1.811596  1.967366e-02 -3.573079 -4.877852e-05
## 6 -2.772783  1.438087e-02 -4.177953 -7.821814e-05
## 8 -2.427253  1.757744e-02 -4.449105 -1.328930e-04
## 9 -4.579248  1.020021e-02 -5.714575 -7.502393e-05

You can then also visualize the tree. By default a numeric summary is shown in each node but you can also easily display the fitted curves:

plot(photomob)
plot(photomob, terminal_panel = node_bivplot, tnex = 2)

photomob visualization 1

photomob visualization 2

As you can see, the tree selected five terminal nodes with different parameters. I would recommend doing some more diagnostics on the model fits in the different nodes, because I'm not sure how well all parameters are identified. I'm not very familiar with NLS and might be completely wrong, but it seems that not all parameters can always be reliably determined.

As one illustration I do the following: I extract all nine fitted nls objects from the tree. For the model from the root node (node 1) I compute the gradient by summing over all observation-wise gradient contributions (as computed by the estfun() method):

photonls <- refit.modelparty(photomob)
library("sandwich")
colSums(estfun(photonls[[1]]))
##             a             b             c             d 
##  2.010552e-05  5.753230e-02 -1.166331e-04  6.771585e+00 

The gradients for parameters a-c are reasonably close to zero, but the one for d is not. This may also affect the inference in mob(), which is based on the observation-wise gradient contributions (aka model scores or estimating functions).
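If you want to run the same check for all nodes at once (just a quick sketch building on the photonls list from above), you can compute the summed gradients per node in one go:

## columns = nodes, rows = parameters; entries far from zero hint at
## poorly identified parameters in that node's fit
sapply(photonls, function(m) colSums(estfun(m)))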

In short: what you want to do can be done! But I would recommend considering a simpler model. If you do, you just need to modify the photofit() function accordingly and run it through mob() again.
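As one possible simplification (purely a sketch, and an assumption about what "simpler" could mean here, not something taken from the question): a self-starting asymptotic light-response curve Pn = Asym * (1 - exp(-exp(lrc) * PAR)) fitted via SSasympOrig() needs no segmented regression for starting values, so the fit function collapses to:

photofit_simple <- function(y, x = NULL, start = NULL, weights = NULL, offset = NULL, ...,
  estfun = FALSE, object = FALSE)
{
  ## drop the intercept column of the default model matrix
  x <- x[, 2]

  ## self-starting asymptotic regression: y = Asym * (1 - exp(-exp(lrc) * x))
  rval <- nls(y ~ SSasympOrig(x, Asym, lrc))

  ## same processed information for mob() as in photofit() above
  list(
    coefficients = coef(rval),
    objfun = deviance(rval),
    estfun = if(estfun) sandwich::estfun(rval) else NULL,
    object = if(object) rval else NULL
  )
}

photomob2 <- mob(Pn ~ PAR | Species, data = eco, fit = photofit_simple,
  control = mob_control(verbose = TRUE))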

  • It's going to take me a while to digest this but this seems awesome! Thanks a mill. I had initially tried to start with Pn ~ Par|Species..but did not know what to do after that. This is the simplest model actually. But thanks a million! – kurtis May 05 '16 at 21:36
  • Also I have the estimated parameters for each species and also I can derive the parameters in each node. I can use this to check how well the parameters were identified...and it seems ok so far.. – kurtis May 05 '16 at 21:54
  • I added more species (10 species in total) and used the catsplit ="multiway" to split the node into all levels of the categorical variable and the parameter estimates for all except for one species (which normally gives some problems) were correct (if I had just run either the nls or the nlsList). The nlme gives slightly different parameter estimates. I am testing it on some fresh data (with 20 species) to see how well it performs. But all is well so far and it can be used.. – kurtis May 05 '16 at 23:30
  • Good to hear that this is useful. Please accept my answer here on SO so that it appears as being solved. – Achim Zeileis May 07 '16 at 20:11