
Let's suppose that I have a matrix in which a few entries are expressions in two variables, p and z (the assignments in the snippet below show exactly which entries).

p and z are variables to be optimized. This matrix is supposed to take part in some heavy algebraic shenanigans, and in the end I want only p and z to receive their new values. That's it; all other elements in the matrix/tensor should remain constant.

Ideally I would like to do something like this:

p = tf.get_variable('p', ())
z = tf.get_variable('z', ())
m = tf.constant(matrix)
m[0,1] = p
m[1,1] = 2 * p
m[1,2] = z
m[2,1] = z - p

Unfortunately, such element-wise assignment seems to be impossible in TensorFlow. Is there a workaround for this?
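One direction that might serve as a workaround is to build `m` out of its pieces with `tf.stack` instead of assigning into a constant. This is only a minimal sketch: it assumes a 3×3 matrix, TensorFlow 1.x graph mode, and made-up values for the constant entries (the real ones would come from `matrix`).

```python
import tensorflow as tf

p = tf.get_variable('p', ())
z = tf.get_variable('z', ())

# Stand-in for the fixed numeric entries of `matrix`; only the positions
# that are NOT overwritten below actually matter here.
c = tf.constant([[1.0, 0.0, 3.0],
                 [4.0, 0.0, 0.0],
                 [7.0, 0.0, 9.0]])

# Assemble the matrix row by row from scalars. The entries built from
# p and z are part of the graph, so gradients flow only through them;
# every other entry stays a plain constant.
row0 = tf.stack([c[0, 0], p,       c[0, 2]])
row1 = tf.stack([c[1, 0], 2.0 * p, z])
row2 = tf.stack([c[2, 0], z - p,   c[2, 2]])
m = tf.stack([row0, row1, row2])
```

The resulting `m` can go through the downstream algebra like any other tensor, and an optimizer minimizing a loss that depends on `m` will only ever update `p` and `z`, since they are the only variables in the graph.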

G. Mesch
  • Possible duplicate of ["freeze" some variables/scopes in tensorflow: stop\_gradient vs passing variables to minimize](https://stackoverflow.com/questions/35298326/freeze-some-variables-scopes-in-tensorflow-stop-gradient-vs-passing-variables) – sascha Oct 21 '17 at 01:28
  • That doesn't seem to be the case unless I've misunderstood the question. – G. Mesch Oct 21 '17 at 14:48
  • It's less about the question and more about the answers (but maybe I misunderstood you). – sascha Oct 21 '17 at 14:49
  • Perhaps I am missing something, but they seem to have little to do with my problem. The issue is that I want my tensor (a matrix) to remain constant except for a few elements at the specified indices. – G. Mesch Oct 21 '17 at 14:56
  • ```tf.Variable(my_weights, trainable=False)```? – sascha Oct 21 '17 at 14:58
  • But how do I incorporate my_weights into a constant tensor at the desired positions? Also, I don't want to stop them from being trainable, unless by my_weights you meant the "constant" part of the matrix itself. – G. Mesch Oct 21 '17 at 15:01
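Regarding the last comment above (placing values into an otherwise constant tensor at specific positions): a minimal sketch using `tf.scatter_nd` plus a 0/1 mask. The base matrix values and the 3×3 shape are assumptions for illustration, standing in for the real constant part.

```python
import tensorflow as tf

p = tf.get_variable('p', ())
z = tf.get_variable('z', ())

# Hypothetical constant base matrix (the part that must stay frozen);
# the four positions overwritten below can hold anything.
base = tf.constant([[1.0, 0.0, 3.0],
                    [4.0, 0.0, 0.0],
                    [7.0, 0.0, 9.0]])

indices = [[0, 1], [1, 1], [1, 2], [2, 1]]        # positions to overwrite
updates = tf.stack([p, 2.0 * p, z, z - p])        # expressions in p and z

# 1.0 at the overwritten positions, 0.0 everywhere else.
mask = tf.scatter_nd(indices, tf.ones(4), [3, 3])
m = base * (1.0 - mask) + tf.scatter_nd(indices, updates, [3, 3])
```

The mask zeroes the base matrix at those four positions and the second `tf.scatter_nd` drops in the expressions built from `p` and `z`, so the rest of the tensor never changes and no extra trainable variables are introduced.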

0 Answers