I have a problem when converting an R Markdown document to PDF. Basically, I have to report the results of a neural network, but after knitting to PDF the document is full of the neural network's training output.
My YAML header and chunks look like this:
title: "Monitoring hydraulic systems"
output: pdf_document
{r setup, include=FALSE}
knitr::opts_chunk$set(echo = TRUE)
```{r echo=FALSE, message=FALSE, warning=FALSE}
library(caret)
# tuning grid for the neural network
nNGrid <- expand.grid(size = seq(1, 9, 1), decay = seq(0.1, 1, 0.1))
# fit with repeated CV (reg_Control and the *_norm data frames are defined below)
neuralNetwork <- train(Class ~ ., data = training_norm, method = "nnet",
                       trControl = reg_Control, tuneGrid = nNGrid)
nNPredictions <- predict(neuralNetwork, newdata = testing_norm)
ty <- confusionMatrix(nNPredictions, testing_norm$Class)
ty$table
```
I set echo=FALSE, message=FALSE and warning=FALSE to try to stop R from printing messages and warnings.
But this is what I get in my final PDF document:

```
weights: 67
initial value 721.992732
iter 10 value 519.008521
iter 20 value 488.124903
iter 30 value 456.941810
iter 40 value 330.558805
iter 50 value 259.373044
```
It's about 500 pages of iteration output! How can I solve this? I need to remove all of that output from my final document. Can anyone help, please?
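For reference, here is a minimal sketch that reproduces the same behaviour with a built-in dataset (iris is just a stand-in for my data; it assumes the caret and nnet packages are installed):

```{r echo=FALSE, message=FALSE, warning=FALSE}
library(caret)
# a tiny nnet fit on iris; the "iter ... value ..." trace lines still show up
# in the knitted document even though message=FALSE and warning=FALSE are set
set.seed(1)
fit <- train(Species ~ ., data = iris, method = "nnet",
             tuneGrid = expand.grid(size = 3, decay = 0.1),
             trControl = trainControl(method = "cv", number = 3))
```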
I forgot to mention that I have already removed verboseIter=T from my reg_Control, but it still doesn't work. I don't know what to do!
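To be concrete, before that change my control object looked roughly like this (the only difference from the current version shown below is the verboseIter argument):

```r
# previous version (roughly), with the verbose flag I have since removed
reg_Control <- trainControl("repeatedcv", number = 5, repeats = 5,
                            classProbs = TRUE, verboseIter = TRUE)
```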
For the sake of completeness, this is my reg_Control and the rest of my setup:
```r
# relabel the target classes
Tabella_per_previsioni$Class <- factor(Tabella_per_previsioni$Class,
                                       labels = c("Alterato", "Ottimale", "Pericolo"))

set.seed(32343)
reg_Control <- trainControl("repeatedcv", number = 5, repeats = 5, classProbs = TRUE)

# 75/25 train/test split
inTrain <- createDataPartition(y = Tabella_per_previsioni$Class, p = 0.75, list = FALSE)
training <- Tabella_per_previsioni[inTrain, ]
testing <- Tabella_per_previsioni[-inTrain, ]

# scale all predictors to [0, 1] using statistics from the training set
train_stats <- preProcess(training, method = "range")
training_norm <- predict(train_stats, training)
testing_norm <- predict(train_stats, testing)
```
I forgot to mention that I'm using caret.