I am currently using Gensim LDA for topic modeling.
While tuning hyper-parameters, I noticed that the model always reports a negative log-perplexity.
Is it normal for the model to behave like this? (Is that even possible?)
If it is, is a smaller value better than a bigger one? (Is -100 better than -20?)