I'm studying pattern recognition, so I created two classes of data and separated them with my model. My data can only take two values, true and false.
To report my results I used confusionMatrix, and while interpreting the output a question came up.
Can confusionMatrix report a misleading accuracy? For example:
I have 10 items, 5 true and 5 false. My classifier gets 8 right and 2 wrong: one item that should be true is classified as false, and another that should be false is classified as true. In that case the predicted totals are still 5 true and 5 false. From the help page in RStudio I cannot tell whether confusionMatrix compares the two vectors item by item or only compares the totals of the possible outcomes.
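Here is a minimal sketch of the scenario I mean, assuming confusionMatrix is the one from the caret package and that the labels below are made-up toy data for illustration:

```r
library(caret)

# Reference (true) labels: 5 TRUE and 5 FALSE
reference <- factor(c(TRUE, TRUE, TRUE, TRUE, TRUE,
                      FALSE, FALSE, FALSE, FALSE, FALSE))

# Predictions: 8 correct, 2 wrong.
# One TRUE item is predicted FALSE and one FALSE item is predicted TRUE,
# so the predicted totals are still 5 TRUE and 5 FALSE.
predicted <- factor(c(TRUE, TRUE, TRUE, TRUE, FALSE,
                      TRUE, FALSE, FALSE, FALSE, FALSE))

confusionMatrix(predicted, reference)
```

My question is whether this call pairs predicted[i] with reference[i] for each item, or whether it only looks at the class totals (which in this example happen to match).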