
I want to force some independent variables into the elastic net regression model, but I get an error when using penalty.factor in the cva.glmnet() function. Any suggestions and comments are appreciated.

# --------------- import data
# outcome: am
# ten predictors: mpg+cyl+disp+hp+drat+wt+qsec+vs+gear+carb
mydata<-mtcars
mydata$am<-factor(mydata$am,levels=c(0,1),labels=c("no","yes"))
mydata$gear<-factor(mydata$gear,levels=c(3,4,5),labels=c("level3","level4","level5"))

str(mydata)

'data.frame':   32 obs. of  11 variables:
 $ mpg : num  21 21 22.8 21.4 18.7 18.1 14.3 24.4 22.8 19.2 ...
 $ cyl : num  6 6 4 6 8 6 8 4 4 6 ...
 $ disp: num  160 160 108 258 360 ...
 $ hp  : num  110 110 93 110 175 105 245 62 95 123 ...
 $ drat: num  3.9 3.9 3.85 3.08 3.15 2.76 3.21 3.69 3.92 3.92 ...
 $ wt  : num  2.62 2.88 2.32 3.21 3.44 ...
 $ qsec: num  16.5 17 18.6 19.4 17 ...
 $ vs  : num  0 0 1 1 0 1 0 1 1 1 ...
 $ am  : Factor w/ 2 levels "no","yes": 2 2 2 1 1 1 1 1 1 1 ...
 $ gear: Factor w/ 3 levels "level3","level4",..: 2 2 2 1 1 1 1 2 2 2 ...
 $ carb: num  4 4 1 1 2 1 4 2 2 4 ...

# --------------- do elastic net cross-validation for alpha and lambda simultaneously
# works
library("glmnetUtils")

set.seed(12345)
cvfit<-cva.glmnet(am~mpg+cyl+disp+hp+drat+wt+qsec+vs+gear+carb,
                  family="binomial",
                  alpha=seq(from=0,to=1,by=0.05),
                  nfolds=10,
                  data=mydata)

# --------------- force three variables of "cyl", "disp", "hp" in the final model
# does not work
set.seed(12345)
cvfit<-cva.glmnet(am~mpg+cyl+disp+hp+drat+wt+qsec+vs+gear+carb,
                  family="binomial",
                  penalty.factor=c(0,0,0,1,1,1,1,1,1,1),
                  alpha=seq(from=0,to=1,by=0.05),
                  nfolds=10,
                  data=mydata)

Error in approx(lambda, seq(lambda), sfrac) : 
  need at least two non-NA values to interpolate
Jing
1 Answer


Since gear is a factor, penalty.factor needs two more entries than the number of predictors suggests: cva.glmnet() keeps all the dummy variables for a factor rather than omitting one, so that the regularisation shrinks the effects toward the overall mean. If one level were omitted, shrinking the remaining coefficients to 0 would force the predictions towards the baseline level instead (see the author's comment below).
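To see where the extra columns come from without fitting anything, you can count the expanded columns yourself. A base-R sketch, under the assumption (confirmed by the author's comment below) that glmnetUtils keeps all levels of each factor:

```r
# Count the columns glmnet will see after factor expansion.
# Assumption: glmnetUtils keeps *all* levels of a factor (no baseline
# level dropped), so gear contributes 3 columns rather than 2.
mydata <- mtcars
mydata$gear <- factor(mydata$gear, levels = c(3, 4, 5),
                      labels = c("level3", "level4", "level5"))
preds <- c("mpg", "cyl", "disp", "hp", "drat", "wt", "qsec", "vs",
           "gear", "carb")
ncols <- sum(sapply(mydata[preds],
                    function(x) if (is.factor(x)) nlevels(x) else 1L))
ncols  # nine numeric predictors + three gear levels = 12
```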

In any case, you can check this with your first example, which works:

set.seed(12345)
cvfit<-cva.glmnet(am~mpg+cyl+disp+hp+drat+wt+qsec+vs+gear+carb,
                  family="binomial",
                  alpha=seq(from=0,to=1,by=0.05),
                  nfolds=10,
                  data=mydata)

dim(cvfit$modlist[[1]]$glmnet.fit$beta)
[1]  12 100

So penalty.factor needs a vector of length 12. With that, the call works:

cvfit<-cva.glmnet(am~mpg+cyl+disp+hp+drat+wt+qsec+vs+gear+carb,
                  family="binomial",
                  penalty.factor=c(0,0,0,1,1,1,1,1,1,1,1,1), 
                  alpha=seq(from=0,to=1,by=0.05),
                  nfolds=10,
                  data=mydata)
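If counting positions by hand feels error-prone, the penalty vector can also be built by matching names. A sketch: the column names below mirror the expanded coefficient names shown above; in practice you would take them from rownames(cvfit$modlist[[1]]$glmnet.fit$beta).

```r
# Build penalty.factor by name instead of counting positions by hand.
# cols mirrors the expanded coefficient names; in a real session, use
# rownames(cvfit$modlist[[1]]$glmnet.fit$beta) instead of typing them.
cols <- c("mpg", "cyl", "disp", "hp", "drat", "wt", "qsec", "vs",
          "gearlevel3", "gearlevel4", "gearlevel5", "carb")
keep <- c("mpg", "cyl", "disp")          # variables to leave unpenalised
pf <- ifelse(cols %in% keep, 0, 1)
pf                                       # pass this as penalty.factor
```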

You can see the first three variables are never zero (each column corresponds to one of the 21 alpha values; the entries count how many of the 100 lambda values give a zero coefficient):

              [,1] [,2] [,3] [,4] [,5] [,6] [,7] [,8] [,9] [,10] [,11] [,12] [,13]
mpg           0    0    0    0    0    0    0    0    0     0     0     0     0
cyl           0    0    0    0    0    0    0    0    0     0     0     0     0
disp          0    0    0    0    0    0    0    0    0     0     0     0     0
hp            0    6    7    7    7    8    9    9   10    11    12    14    15
drat          0    7    7    7    8    8    8    9   10    10    11    13    14
wt            0   13   14   15   16   17   18   19   20    21    22    23    25
qsec          0    1    1    1    1    1    1    1    1     1     1     1     1
vs            0   11   12   13   14   15   15   16   17    17    18    19    19
gearlevel3    0    4    4    5    5    5    5    5    5     6     6     6     7
gearlevel4    0   50   49   50   51   52   53   55   58    61    65    70    77
gearlevel5    0    1    1    1    1    1    1    1    1     1     1     1     1
carb          0   12   17   23   29   37   46   62   65    70    75    80    88
           [,14] [,15] [,16] [,17] [,18] [,19] [,20] [,21]
mpg            0     0     0     0     0     0     0     0
cyl            0     0     0     0     0     0     0     0
disp           0     0     0     0     0     0     0     0
hp            17    20    22    26    32    40    62    73
drat          16    19    23    27    33    41    58    73
wt            26    28    30    33    38    48    78    73
qsec           1     1     1     1     1     1     1     2
vs            20    22    23    25    29    35    58    73
gearlevel3     7     8     8     9    11    14    37    73
gearlevel4    84    87    85    84    82    80    78    73
gearlevel5     1     1     1     1     1     1     1     1
carb          88    87    85    84    82    80    78    73
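A zero-count table like the one above can be computed by counting, for each alpha (one element of cvfit$modlist), how many points on the lambda path give a zero coefficient. A sketch, with small mock matrices standing in for the fitted m$glmnet.fit$beta objects (in a real fit, beta is a sparse matrix, so wrap it in as.matrix() first):

```r
# Count zero coefficients per predictor across the lambda path, one
# column per alpha. Mock matrices stand in for m$glmnet.fit$beta here.
modlist <- list(
  list(glmnet.fit = list(beta = matrix(
    c(0.5, 0, 0.2,  0, 0, 0.1), nrow = 3,
    dimnames = list(c("mpg", "hp", "wt"), NULL)))),
  list(glmnet.fit = list(beta = matrix(
    c(0.4, 0, 0,    0, 0.3, 0), nrow = 3,
    dimnames = list(c("mpg", "hp", "wt"), NULL))))
)
zero_counts <- sapply(modlist,
                      function(m) rowSums(m$glmnet.fit$beta == 0))
zero_counts  # rows: predictors; columns: one per alpha
```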
StupidWolf
    glmnetUtils author here. Good answer (+1). The reason glmnetUtils keeps all dummy variables for a factor, rather than omitting one, is so that the regularisation will work properly. If 1 level is omitted, shrinking the remaining coefficients to 0 means making the efffects more similar to the baseline level, which is usually not very meaningful. Retaining all levels shrinks the effects toward the overall mean, which _is_ meaningful. – Hong Ooi Nov 02 '20 at 08:32
  • Hi @HongOoi, did not realise you are the author. Yup I see, the intercept is not penalized, so you want to avoid forcing it towards the reference level. Hopefully I got this correct in the edited answer, feel free to edit. Thanks for the followup :) – StupidWolf Nov 03 '20 at 08:47