TensorFlow variables support constraints, and this includes variables created via add_weight (see the tf.Variable documentation).
For example, if you want to force a variable to stay in the range [0, 1]:
    self.add_weight(shape=some_shape, constraint=lambda x: tf.clip_by_value(x, 0, 1))
In general, constraint should be a function that takes the variable's value as input and returns a new value for it; in this case, the value clipped to [0, 1].
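Here is a fuller, runnable sketch of the same idea inside a custom layer (the layer name BoundedScale and the weight name "scale" are my own, not from the docs):

    import tensorflow as tf

    class BoundedScale(tf.keras.layers.Layer):
        """Elementwise scaling layer whose weights are kept inside [0, 1]."""

        def build(self, input_shape):
            # add_weight forwards `constraint` to the underlying tf.Variable.
            self.scale = self.add_weight(
                name="scale",
                shape=(input_shape[-1],),
                initializer="ones",
                constraint=lambda x: tf.clip_by_value(x, 0.0, 1.0),
                trainable=True,
            )

        def call(self, inputs):
            return inputs * self.scale

After every optimizer update, Keras reapplies the constraint, so layer.scale can never leave [0, 1] no matter what gradients arrive.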
Note that this is implemented by simply calling the function on the variable after the optimizer takes its gradient step. This means that values that "want" to be outside the range get clipped to hard 0s and 1s, and you may end up with many values sitting exactly on the boundary. As @y.selivonchyk notes, this is not "mathematically sound", i.e. the gradients don't know about the constraint. For the best effect, you might want to combine the constraint with the regularization they propose.
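As a hedged sketch of one way to do that combination (the boundary_penalty below is an illustrative stand-in I made up, not the regularizer from their answer, and the 1e-3 strength is arbitrary), you can pass both constraint and regularizer to the same add_weight call inside your layer's build:

    # Hard clip after each step, plus a soft quadratic pull toward the
    # middle of [0, 1] so fewer weights end up pinned exactly at 0 or 1.
    def boundary_penalty(x):
        return 1e-3 * tf.reduce_sum(tf.square(x - 0.5))

    self.add_weight(
        shape=some_shape,  # as in the snippet above
        constraint=lambda x: tf.clip_by_value(x, 0.0, 1.0),  # hard projection
        regularizer=boundary_penalty,  # soft gradient pressure away from the edges
    )

The regularizer contributes to the loss, so the gradients do "know" about it, while the constraint remains a hard guarantee applied after each update.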