What is Precision @ k used for in the outlier case? When varying k on the same dataset I always get "Precision @ 3016", and I don't understand where ELKI gets that number from; my dataset contains 1508 outliers (so 3016 is exactly twice the number of outliers).
And what do precision.average, precision.r, and f1.maximum measure?
I know ROCAUC measures how well the algorithm ranks the outliers as outliers and the normal objects as normal. I want to judge whether the quality of the outlier detection is good. Can I do that with the other measures too? Here is the ELKI output I am referring to (below it, I sketch my current understanding of these measures in code):
Computing LDOFs
LDOF for objects: 49534 [100%]
de.lmu.ifi.dbs.elki.algorithm.outlier.lof.LDOF.runtime: 116887 ms
Evaluating using minority class: yes
de.lmu.ifi.dbs.elki.evaluation.outlier.OutlierRankingEvaluation.rocauc: 0.736341684836717
de.lmu.ifi.dbs.elki.evaluation.outlier.OutlierRankingEvaluation.precision.average: 0.10795456476088741
de.lmu.ifi.dbs.elki.evaluation.outlier.OutlierRankingEvaluation.precision.r: 0.16578249336870027
de.lmu.ifi.dbs.elki.evaluation.outlier.OutlierRankingEvaluation.f1.maximum: 0.18336314847942753
ROCAUC: 0.7363416848367167
Precision @ 3016 0.13726790450928383
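
For reference, here is a minimal self-contained Java sketch of how I currently understand ROCAUC, plus my guess at what Precision @ k means (the fraction of true outliers among the k top-ranked objects). This is my own toy code with made-up scores and labels, not ELKI's implementation, so please correct me if my understanding is wrong:

import java.util.Arrays;
import java.util.Comparator;

public class OutlierEvalSketch {

    // ROCAUC as I understand it: the probability that a randomly chosen true
    // outlier is ranked above a randomly chosen inlier (ties ignored here).
    static double rocAuc(double[] scores, boolean[] isOutlier) {
        Integer[] idx = rankByScoreDesc(scores);
        long pos = 0, neg = 0;
        for (boolean b : isOutlier) { if (b) pos++; else neg++; }
        long inliersSeen = 0, pairsWon = 0;
        for (int i : idx) {
            if (isOutlier[i]) pairsWon += neg - inliersSeen; // inliers ranked below this outlier
            else inliersSeen++;
        }
        return (double) pairsWon / ((double) pos * neg);
    }

    // Precision @ k as I would guess it is defined: the fraction of true
    // outliers among the k highest-scored objects. I am not sure this is
    // exactly what ELKI computes, hence my question about k = 3016.
    static double precisionAtK(double[] scores, boolean[] isOutlier, int k) {
        Integer[] idx = rankByScoreDesc(scores);
        int hits = 0;
        for (int r = 0; r < k; r++) {
            if (isOutlier[idx[r]]) hits++;
        }
        return (double) hits / k;
    }

    // Sort object indexes by descending outlier score.
    static Integer[] rankByScoreDesc(double[] scores) {
        Integer[] idx = new Integer[scores.length];
        for (int i = 0; i < idx.length; i++) idx[i] = i;
        Arrays.sort(idx, Comparator.comparingDouble((Integer i) -> scores[i]).reversed());
        return idx;
    }

    public static void main(String[] args) {
        // Made-up toy data: 6 objects, 2 of them true outliers.
        double[] scores = {0.9, 0.8, 0.7, 0.6, 0.5, 0.4};
        boolean[] labels = {true, false, true, false, false, false};
        System.out.println("ROCAUC        = " + rocAuc(scores, labels));          // 0.875
        System.out.println("Precision @ 2 = " + precisionAtK(scores, labels, 2)); // 0.5
    }
}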