
I have an NN model below in a Jupyter notebook:

import tensorflow as tf

# Placeholders for the input features and the one-hot targets
x = tf.placeholder(tf.float32, [None, feature_count])
y_ = tf.placeholder(tf.float32, [None, target_count])

W = tf.Variable(tf.zeros([feature_count, target_count]))
b = tf.Variable(tf.zeros([target_count]))
y = tf.nn.softmax(tf.matmul(x, W) + b)
cross_entropy = tf.reduce_mean(
    -tf.reduce_sum(y_ * tf.log(y), reduction_indices=[1]))
correct_prediction = tf.equal(tf.argmax(y, 1), tf.argmax(y_, 1))
accuracy = tf.reduce_mean(tf.cast(correct_prediction, tf.float32))
train_op = tf.train.GradientDescentOptimizer(
    learning_rate).minimize(cross_entropy)

I am using the following script to train this model:

# Initialize variables once, before the loop; re-running the initializer
# inside the loop would reset the weights at every epoch
init_op = tf.global_variables_initializer()
sess.run(init_op)

for epoch in range(training_epochs):
    curr_data_batch, curr_target_batch = sess.run(
        [data_batch, target_batch], options=run_options)
    sess.run(train_op, feed_dict={x: curr_data_batch,
                                  y_: curr_target_batch})

For the purpose of multi-label classification, I want to create n binary classifiers. How do I modify this code to create n classifiers?

- Do I need to create a separate graph for each model?
- The input data is the same for all models, but the output y_ will be different. If I create different graphs, will I still be able to share the input data across the models?
- Do I need to create a different session for each model?
- If the models live in different graphs, can I run all of them on the same input data at prediction time and collect the predicted results?

    `tf.nn.sigmoid_cross_entropy_with_logits` can do N binary classifications at once. You will have one graph and one session. See this question - https://stackoverflow.com/q/47034888/712995 – Maxim Feb 01 '18 at 12:18
  • @Maxim will it be N independent models? (Similar to creating N models separately). I don't want weights to be shared. – Ravikrn Feb 01 '18 at 12:39

1 Answer


You can try variable scoping to share the code across your models.

So basically you would have one function, say make_classification, whose weights and biases are initialized each time under a different scope.

To sum it up, variable scoping (looping over the number n) will allow you to reuse your code without sharing the weights; see the sketch below.
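A minimal sketch of that idea, assuming TensorFlow 1.x and that feature_count, n, and learning_rate are defined as in the question (the scope names and the make_classification signature here are illustrative, not part of the original code):

import tensorflow as tf

def make_classification(x, feature_count, scope_name):
    # Each call creates a fresh W and b under its own variable scope,
    # so the classifiers share the input x but not the weights
    with tf.variable_scope(scope_name):
        W = tf.get_variable("W", [feature_count, 1],
                            initializer=tf.zeros_initializer())
        b = tf.get_variable("b", [1],
                            initializer=tf.zeros_initializer())
        return tf.matmul(x, W) + b

x = tf.placeholder(tf.float32, [None, feature_count])
y_ = tf.placeholder(tf.float32, [None, n])  # one binary target per classifier

# n independent binary classifiers in one graph, driven by one session
logits = tf.concat([make_classification(x, feature_count, "clf_%d" % i)
                    for i in range(n)], axis=1)
loss = tf.reduce_mean(
    tf.nn.sigmoid_cross_entropy_with_logits(labels=y_, logits=logits))
train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(loss)

Because the n classifiers share no variables, minimizing the combined loss updates each one independently, just as if you had trained n separate models.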

However, multi-label classification can also be done directly in TensorFlow with the existing API (tf.nn.sigmoid_cross_entropy_with_logits, as mentioned in the comments).
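A sketch of that direct approach, reusing the placeholders and shapes from the question:

# One weight matrix whose i-th column acts as binary classifier i;
# unlike the scoped version above, all columns live in a single variable
W = tf.Variable(tf.zeros([feature_count, target_count]))
b = tf.Variable(tf.zeros([target_count]))
logits = tf.matmul(x, W) + b

# Sigmoid (not softmax) cross-entropy treats each of the target_count
# labels as an independent yes/no decision
loss = tf.reduce_mean(
    tf.nn.sigmoid_cross_entropy_with_logits(labels=y_, logits=logits))
train_op = tf.train.GradientDescentOptimizer(learning_rate).minimize(loss)

# Per-label probabilities, thresholded at 0.5 for predictions
predictions = tf.cast(tf.nn.sigmoid(logits) > 0.5, tf.float32)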

  • My dataset has 1000 labels. When using `tf.nn.sigmoid_cross_entropy_with_logits` with 1000 sparse labels, I will run into the issue of class imbalance, as discussed [here](https://github.com/BartyzalRadek/Multi-label-Inception-net/issues/8). – Ravikrn Feb 01 '18 at 14:07
  • I want the input values to be shared, not the weights. Can I use variable scoping to make this happen? – Ravikrn Feb 01 '18 at 17:00
  • Yes, that is exactly what scoping allows you to do: share the inputs and the underlying network code, but not the weights. Please go through the docs for the example of sharing a convolutional layer. – user3480922 Feb 02 '18 at 10:11