
Q1: How to tune the "hidden" hyperparameter in "classif.h2o.deeplearning"?

I have found several different approaches on Stack Overflow:

makeDiscreteParam("hidden", values = list(one = 10, two = c(10, 5, 10)))
makeDiscreteParam(id = "hidden", values = list(a = c(10,10), b = c(20,20,20), c = c(30,30,30)))
makeDiscreteParam(id = "hidden", values = list(a = c(10,10), b = c(100,100)))
makeIntegerVectorParam("hidden", len = 2, lower = 10, upper = 100)

As per the documentation:

hidden: Specifies the number and size of each hidden layer in the model. For example, if c(100,200,100) is specified, a model with 3 hidden layers is generated. The middle hidden layer will have 200 neurons and the first and third hidden layers will have 100 neurons each. The default is c(200,200). For grid search, use the following format: list(c(10,10), c(20,20)). Refer to the section on Performing a Trial Run for more details.

hidden is "integervector", so I can use makeIntegerVectorParam, may I know the syntax for that for below 2 cases (I have Def(200,200) for hidden in getParamSet)

2 hidden layers with 30 neurons in each?
2 hidden layers with a different number of neurons in each, say 30 and 20?
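
My current guess at the makeIntegerVectorParam syntax for both cases, based on the ParamHelpers documentation (untested, and the object names are just placeholders):

library(mlr)

# Case 1 (guess): 2 hidden layers with 30 neurons each -- setting lower == upper
# pins every dimension of the vector to exactly 30.
ps_case1 <- makeParamSet(
  makeIntegerVectorParam("hidden", len = 2, lower = 30, upper = 30)
)

# Case 2 (guess): 2 hidden layers with different sizes, e.g. 30 and 20 --
# lower/upper may also be vectors of length `len`, giving one bound per layer.
ps_case2 <- makeParamSet(
  makeIntegerVectorParam("hidden", len = 2, lower = c(30, 20), upper = c(30, 20))
)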

Q2: If I tune, say, 5 parameters at once it takes a long time. Should I instead tune them one by one in makeParamSet, take the optimum value of each, and then combine them all with those specific values in a final makeParamSet? Is this a valid approach?

Regarding Q2: tuning hyperparameters one by one and then combining the results may not be a correct approach, but does it at least give a reasonable starting point?
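
In the meantime, here is a rough sketch of tuning several parameters jointly with random search rather than a full grid, which caps the runtime at a fixed number of evaluations; the task object `task`, the chosen parameters and their ranges are assumptions of mine, not recommendations:

library(mlr)

lrn <- makeLearner("classif.h2o.deeplearning")

# Tune a handful of parameters together; random search evaluates only `maxit`
# random configurations but still explores the parameters jointly.
ps <- makeParamSet(
  makeDiscreteParam("hidden", values = list(a = c(30, 30), b = c(100, 100))),
  makeNumericParam("rate", lower = 0.001, upper = 0.1),
  makeNumericParam("epochs", lower = 10, upper = 100)
)
ctrl  <- makeTuneControlRandom(maxit = 20L)
rdesc <- makeResampleDesc("CV", iters = 3L)

# `task` is assumed to be an existing classification task.
res <- tuneParams(lrn, task = task, resampling = rdesc, par.set = ps, control = ctrl)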

Q3: I am getting 33 hyperparameters for classif.h2o.deeplearning, is there a way to choose the right ones to tune?

hanzgs
  • > Is it a correct approach to tune hyperparameters one by one and then combine? No, hyperparameter values depend on each other during tuning and the outcomes will differ. – pat-s Apr 30 '19 at 09:01
  • > I am getting 33 hyperparameters for classif.h2o.deeplearning, is there a way to choose the right ones to tune? No, you need to choose them yourself. There is no general answer as to which hyperparameters should be used for an algorithm. – pat-s Apr 30 '19 at 09:04

1 Answer


I finally understood how to tune the hidden parameter:

makeDiscreteParam("hidden", values = list(one = c(30,30), two = c(30, 30, 30), three=c(30, 30, 30, 30)))

This tunes over the following candidates (see the full tuning sketch after the list):

one: 2 hidden layers with 30 neurons each

two: 3 hidden layers with 30 neurons each

three: 4 hidden layers with 30 neurons each
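
For reference, a sketch of plugging this parameter set into a full tuning run; the task object `task` and the 3-fold CV resampling are assumptions on my side:

library(mlr)

lrn <- makeLearner("classif.h2o.deeplearning")
ps  <- makeParamSet(
  makeDiscreteParam("hidden", values = list(one   = c(30, 30),
                                            two   = c(30, 30, 30),
                                            three = c(30, 30, 30, 30)))
)
rdesc <- makeResampleDesc("CV", iters = 3L)
ctrl  <- makeTuneControlGrid()   # only three candidates, so a grid search is cheap

# `task` is assumed to be an existing classification task.
res <- tuneParams(lrn, task = task, resampling = rdesc, par.set = ps, control = ctrl)
res$x   # the selected value of `hidden`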

hanzgs
  • I wonder, is it possible to tune the number of neurons in each layer? – hanzgs May 07 '19 at 05:16
  • I have no experience with ANN handling. Just a short note about Stackoverflow and asking questions: Please try to ask only ONE question. This makes it easier to answer for everyone as one dedicated answer can be given. You might also get a higher response rate by doing so. – pat-s May 07 '19 at 19:42
  • Also understanding parameters of algorithms is not related to _mlr_. There are a lot of books out there that describe the algorithm parameters very well that should be the first resource to look at. – pat-s May 07 '19 at 19:44