
I'm using the multinom function from the nnet package to fit a multinomial logistic regression in R. I expected the parameter estimates to be on the logit scale, but transforming the coefficients with the inverse logit doesn't give probabilities that match the output of predict(); see the example below.

The help file states that "A log-linear model is fitted, with coefficients zero for the first class", but how do I transform parameter estimates to get predicted effects on the probability scale?

library("nnet")

set.seed(123)

# Simulate some simple fake data
groups <- t(rmultinom(500, 1, prob = c(0.05, 0.3, 0.65))) %*% c(1:3)
moddat <- data.frame(group = factor(groups))

# Fit the multinomial model
mod <- multinom(group ~ 1, moddat)
predict(mod, type = "probs")[1,] # predicted probabilities recover generating probs

# But transformed coefficients don't become probabilities
plogis(coef(mod))       # inverse logit
1/(1 + exp(-coef(mod))) # inverse logit

Using predict I can recover the generating probabilities:

   1    2    3 
0.06 0.30 0.64 

But taking the inverse logit of the coefficients does not give probabilities:

  (Intercept)
2   0.8333333
3   0.9142857
  • This is more of an interpretation question and thus more appropriate for CrossValidated.com. But don't post it there because it's already been asked and answered: https://stats.stackexchange.com/questions/17196/interpreting-expb-in-multinomial-logistic-regression – IRTFM Oct 15 '21 at 22:51
  • Thank you for the heads up, I actually found a different post that describes the solution -- the softmax transformation https://stackoverflow.com/questions/17283595/fitted-values-for-multinom-in-r-coefficients-for-reference-category?rq=1 – missng Oct 17 '21 at 22:50
  • I can’t retract my close vote or change the reasoning, but the question should be closed as a duplicate. – IRTFM Oct 18 '21 at 15:37
  • Apologies, I didn't realize. Should I delete this question? – missng Nov 05 '21 at 16:55
  • If you look at your question and think it has unique features that might let it be found by searches that would not surface the duplicate identified by @missng, then you should leave it up. When I looked at the two questions, it seemed your answer added some useful terminology for search purposes. So I'm now in the curious situation of having just upvoted your question and earlier downvoted your answer (on the basis of duplication). I'll reverse my downvote if you can edit it even slightly. – IRTFM Nov 05 '21 at 17:04

1 Answer


The inverse logit is the correct back transformation for a binomial model. For a multinomial model, the appropriate back transformation is the softmax function, as described in this question: https://stackoverflow.com/questions/17283595/fitted-values-for-multinom-in-r-coefficients-for-reference-category

The statement in the documentation that "A log-linear model is fitted, with coefficients zero for the first class" means that the linear predictor for the first (reference) class is fixed at 0 on the link scale, so each fitted coefficient is the log odds of its class relative to that reference class.
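
Concretely, writing beta_1 = 0 for the reference class, the fitted probabilities are

p_k = exp(beta_k) / (exp(beta_1) + exp(beta_2) + ... + exp(beta_K))

that is, the softmax of the coefficient vector with a 0 prepended for the reference class.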

To recover the probabilities manually from the example above:

library("nnet")
set.seed(123)

groups <- t(rmultinom(500, 1, prob = c(0.05, 0.3, 0.65))) %*% c(1:3)
moddat <- data.frame(group = factor(groups))
 
mod <- multinom(group ~ 1, moddat)
# weights:  6 (2 variable)
# initial  value 549.306144 
# final  value 407.810115 
# converged
predict(mod, type = "probs")[1,] # predicted probabilities recover generating probs
#   1    2    3 
# 0.06 0.30 0.64 

# Inverse logit is incorrect
1/(1 + exp(-coef(mod))) # inverse logit
#   (Intercept)
# 2   0.8333333
# 3   0.9142857
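
# Why those numbers: each coefficient is log(p_k / p_1), so the inverse logit
# returns p_k / (p_1 + p_k), the conditional probability of class k given
# classes {1, k}, not the marginal probability p_k itself:
p <- c(0.06, 0.30, 0.64) # fitted probabilities from above
p[2] / (p[1] + p[2]) # 0.8333333, matches row "2"
p[3] / (p[1] + p[3]) # 0.9142857, matches row "3"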

 
# Use the softmax transformation instead
softmax <- function(x) {
  expx <- exp(x)
  expx / sum(expx)
}

# Add the reference class coefficient (0 on the link scale), then apply softmax
all_coefs <- rbind("1" = 0, coef(mod))
softmax(all_coefs)
#   (Intercept)
# 1        0.06
# 2        0.30
# 3        0.64
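
As a quick consistency check, note that with only two classes the softmax reduces to the inverse logit, which is why plogis is the right back transformation for binomial models but not for multinomial ones. A small sketch using the softmax helper defined above:

# Softmax of (0, beta) for two classes equals the inverse logit of beta
softmax(c(0, 1.5))[2] # 0.8175745
plogis(1.5)           # 0.8175745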
  • The proper action is not to offer an answer to a duplicate question but rather to flag it. – IRTFM Oct 18 '21 at 15:38