Questions tagged [gradienttape]

147 questions
25 votes · 6 answers

Applying callbacks in a custom training loop in Tensorflow 2.0

I'm writing a custom training loop using the code provided in the Tensorflow DCGAN implementation guide. I wanted to add callbacks in the training loop. In Keras I know we pass them as an argument to the 'fit' method, but can't find resources on how…
Umair Khawaja • 403 • 1 • 4 • 9
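A common answer to this question: Keras callbacks expose `on_*` hooks that you can invoke yourself from a custom loop. A minimal sketch (the model and the metric values are illustrative placeholders):

```python
import tensorflow as tf

# Any Keras callback can be driven manually; History is used here
# because its effect is easy to inspect.
model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
history = tf.keras.callbacks.History()
history.set_model(model)

history.on_train_begin()
for epoch in range(2):
    history.on_epoch_begin(epoch)
    # ... run your GradientTape training steps here ...
    logs = {"loss": 0.5 / (epoch + 1)}   # placeholder metric
    history.on_epoch_end(epoch, logs)
history.on_train_end()

print(history.history["loss"])   # the callback recorded both epochs
```

For several callbacks at once, `tf.keras.callbacks.CallbackList` wraps them and fans out each hook call.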
7 votes · 1 answer

Taking gradients when using tf.function

I am puzzled by the behavior I observe in the following example: import tensorflow as tf @tf.function def f(a): c = a * 2 b = tf.reduce_sum(c ** 2 + 2 * c) return b, c def fplain(a): c = a * 2 b = tf.reduce_sum(c ** 2 + 2 * c) …
marlon • 73 • 4
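The puzzle in this question can be reproduced in a few lines. Gradients with respect to the input flow through a `tf.function` call, but gradients with respect to an intermediate tensor *returned* by the traced function typically do not, because the tape records the call as a single op. A sketch:

```python
import tensorflow as tf

@tf.function
def f(a):
    c = a * 2
    return tf.reduce_sum(c ** 2 + 2 * c), c

a = tf.Variable(1.0)
with tf.GradientTape(persistent=True) as tape:
    b, c = f(a)

g_a = tape.gradient(b, a)   # works: b = 4a^2 + 4a, so db/da = 8a + 4 = 12
g_c = tape.gradient(b, c)   # typically None: c is an output of the traced
                            # call, so the tape sees no path from c to b
print(g_a.numpy(), g_c)
del tape
```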
5 votes · 1 answer

TensorFlow v2 gradients not shown in TensorBoard histograms

I have a simple neural network for which I am trying to plot the gradients using tensorboard by using a callback as below: class GradientCallback(tf.keras.callbacks.Callback): console = False count = 0 run_count = 0 def…
bit • 4,407 • 1 • 28 • 50
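The usual workaround (since the TensorBoard callback's `write_grads` option was removed in TF2) is to compute gradients with an explicit tape and write them as summaries yourself. A hedged sketch, with an illustrative model and log directory:

```python
import tempfile
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(2)])
x = tf.random.normal([8, 3])
y = tf.random.normal([8, 2])

writer = tf.summary.create_file_writer(tempfile.mkdtemp())
with tf.GradientTape() as tape:
    loss = tf.reduce_mean((model(x) - y) ** 2)
grads = tape.gradient(loss, model.trainable_variables)

# One histogram per variable, tagged with the variable's name.
with writer.as_default():
    for var, g in zip(model.trainable_variables, grads):
        tf.summary.histogram(var.name + "/grad", g, step=0)
```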
5 votes · 3 answers

How to use Tensorflow BatchNormalization with GradientTape?

Suppose we have a simple Keras model that uses BatchNormalization: model = tf.keras.Sequential([ tf.keras.layers.InputLayer(input_shape=(1,)), tf.keras.layers.BatchNormalization() ]) How to actually use it…
Zuza • 2,136 • 4 • 20 • 22
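The short version of the accepted approach: call the layer with `training=True` inside the tape. In TF2, BatchNormalization updates its moving statistics as a side effect of the call, so no manual update-op handling is needed. A minimal sketch:

```python
import tensorflow as tf

bn = tf.keras.layers.BatchNormalization()
x = tf.constant([[1.0], [2.0], [3.0], [4.0]])

with tf.GradientTape() as tape:
    # training=True normalizes with batch statistics and updates the
    # layer's moving mean/variance as a side effect of the call.
    out = bn(x, training=True)
    loss = tf.reduce_mean(out ** 2)

# Only gamma and beta are trainable; the moving statistics are not,
# so the tape returns gradients just for those two variables.
grads = tape.gradient(loss, bn.trainable_variables)
print(len(grads))   # 2
```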
4 votes · 1 answer

Error polling for event status: failed to query event: CUDA_ERROR_LAUNCH_FAILED: unspecified launch failure

I have been struggling with this problem for five days and have read several posts on Stack Overflow, but still have no clear idea of how to solve it. People who solved this issue just recommended trying different NVIDIA driver versions…
yuanhang • 91 • 1 • 7
3 votes · 1 answer

Autodiff implementation for gradient calculation

I have worked through some papers about the autodiff algorithm to implement it for myself (for learning purposes). I compared my algorithm against the output of TensorFlow in test cases and their outputs did not match in most cases. Therefore I worked…
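For reference when debugging such an implementation, reverse-mode autodiff can be sketched in a few lines of plain Python (this is an illustrative toy, not the asker's code): each node stores its parents and local partial derivatives, and a reverse sweep accumulates the output's gradient into each node.

```python
# Tiny reverse-mode autodiff sketch. The naive stack-based sweep below
# assumes a node's gradient is complete by the time it is popped, which
# holds for this small expression.
class Node:
    def __init__(self, value, parents=()):
        self.value = value
        self.parents = parents      # list of (parent, local_derivative)
        self.grad = 0.0

    def __mul__(self, other):
        return Node(self.value * other.value,
                    [(self, other.value), (other, self.value)])

    def __add__(self, other):
        return Node(self.value + other.value, [(self, 1.0), (other, 1.0)])

def backward(output):
    output.grad = 1.0
    stack = [output]
    while stack:
        node = stack.pop()
        for parent, local in node.parents:
            parent.grad += node.grad * local
            stack.append(parent)

x = Node(3.0)
y = x * x + x            # f(x) = x^2 + x, so f'(3) = 2*3 + 1 = 7
backward(y)
print(y.value, x.grad)   # 12.0 7.0
```

Comparing `x.grad` against `tf.GradientTape` on the same expression is a quick sanity check for each primitive's local derivative.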
3 votes · 0 answers

How can I parallelize auto-differentiation with tf.GradientTape?

I would like to auto-differentiate across a rather complex function that I wish to parallelize. I am using TensorFlow 2.x and using tf.GradientTape for differentiation. I have made a toy example that illustrates the point. The auto-differentiation…
Morten Grum • 962 • 1 • 10 • 25
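One relevant building block here: `tape.jacobian` already vectorizes the per-output gradient computation internally (via `pfor`) rather than running a Python loop. A small sketch with an element-wise function, whose Jacobian is diagonal:

```python
import tensorflow as tf

x = tf.constant([1.0, 2.0, 3.0])
with tf.GradientTape() as tape:
    tape.watch(x)
    y = x ** 2          # element-wise, so the Jacobian is diag(2x)

# tape.jacobian computes all rows of dy/dx in one vectorized pass.
jac = tape.jacobian(y, x)
print(jac.numpy())      # diagonal 2, 4, 6
```

Whether this helps depends on the structure of the function being differentiated; for genuinely independent sub-problems, `tf.vectorized_map` is another option.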
3 votes · 0 answers

Why does tf.GradientTape() use less GPU memory when watching model variables manually?

So when I use tf.GradientTape() to automatically monitor the trainable variables in a ResNet model, the computer throws an out-of-memory error. Below is the code: x_mini = preprocess_input(x_train) with tf.GradientTape() as tape: outputs =…
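The mechanism behind manual watching can be sketched as follows: with `watch_accessed_variables=False` the tape records only what you watch explicitly, which limits what it has to keep around (variable names here are illustrative).

```python
import tensorflow as tf

v1 = tf.Variable(2.0)
v2 = tf.Variable(3.0)

# Only explicitly watched variables are recorded by this tape.
with tf.GradientTape(watch_accessed_variables=False) as tape:
    tape.watch(v1)                 # v2 is deliberately not watched
    loss = v1 * v2

g1, g2 = tape.gradient(loss, [v1, v2])
print(g1.numpy(), g2)   # 3.0 None — no gradient for the unwatched v2
```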
3 votes · 2 answers

How to use class_weights in a custom loss function with a custom training loop (i.e. not using .fit)

I have written my custom training loop using tf.GradientTape(). My data has 2 classes. The classes are not balanced; class1 contributes almost 80% of the data and class2 the remaining 20%. Therefore, in order to remove this imbalance, I was trying to…
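One common answer, sketched: gather a per-example weight from the class-weight vector and apply it to the per-example loss before reducing (the weight values below are illustrative choices for an 80/20 split).

```python
import tensorflow as tf

class_weight = tf.constant([0.625, 2.5])   # weights for class 0 and 1
y_true = tf.constant([0, 0, 0, 0, 1])      # 80% class 0, 20% class 1
logits = tf.zeros([5, 2])                  # placeholder model output

per_example = tf.nn.sparse_softmax_cross_entropy_with_logits(
    labels=y_true, logits=logits)
weights = tf.gather(class_weight, y_true)  # one weight per example
loss = tf.reduce_mean(per_example * weights)
print(loss.numpy())
```

The same `loss` can then be differentiated inside the training step with `tape.gradient(loss, model.trainable_variables)`.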
3 votes · 0 answers

Tensorflow gradient of loss with respect to model output gives None

I'm trying to differentiate my loss function with respect to the model output in the training_step function of a tf.keras.Model. This is my attempt: def train_step(self, data): x, y = data with tf.GradientTape(persistent=True) as…
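A frequent cause of `None` here is asking for the gradient of a tensor the tape never saw. A sketch of the working pattern, with an illustrative model: compute the output inside the tape, then differentiate the loss with respect to that exact tensor.

```python
import tensorflow as tf

model = tf.keras.Sequential([tf.keras.layers.Dense(1)])
x = tf.random.normal([4, 3])
y = tf.random.normal([4, 1])

with tf.GradientTape() as tape:
    y_pred = model(x, training=True)   # computed inside the tape
    tape.watch(y_pred)                 # make the output an explicit target
    loss = tf.reduce_mean((y_pred - y) ** 2)

# Gradient of the loss w.r.t. the model *output* tensor:
dl_dy = tape.gradient(loss, y_pred)
print(dl_dy.shape)   # (4, 1)
```

Differentiating with respect to a tensor computed outside the tape, or a different copy of the output, yields `None`.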
3 votes · 0 answers

Abysmal tf.GradientTape performance compared to tf.gradients() for computing jacobians

SOLUTION BELOW: Scenario: I am trying to compute the jacobian of a user defined function many, many times in a loop. I am able to do this with TF 2's GradientTape as well as the older session based tf.gradients() method. The problem is that…
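The usual remedy for this kind of slowdown, sketched under the assumption that retracing is the culprit: wrap the tape-plus-jacobian computation in `tf.function`, so the gradient graph is built once and reused across loop iterations.

```python
import tensorflow as tf

@tf.function
def jac_fn(x):
    # Traced once for this input signature; later calls reuse the graph
    # instead of rebuilding the gradient computation every iteration.
    with tf.GradientTape() as tape:
        tape.watch(x)
        y = tf.sin(x) * x          # illustrative user-defined function
    return tape.jacobian(y, x)

x = tf.constant([0.0, 1.0])
j = jac_fn(x)
print(j.shape)   # (2, 2)
```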
3 votes · 2 answers

GradientTape with Keras returns 0

I've tried using GradientTape with a Keras model (simplified) as follows: import tensorflow as tf tf.enable_eager_execution() input_ = tf.keras.layers.Input(shape=(28, 28)) flat = tf.keras.layers.Flatten()(input_) output = tf.keras.layers.Dense(10,…
kwkt • 1,058 • 3 • 10 • 19
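The classic gotcha behind this one: plain tensors are not watched by default, so gradients with respect to the model *input* require an explicit `tape.watch`. A sketch mirroring the question's shapes:

```python
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10),
])

x = tf.random.normal([1, 28, 28])
with tf.GradientTape() as tape:
    tape.watch(x)          # plain tensors are not watched by default
    out = model(x)
    loss = tf.reduce_sum(out)

# Gradients w.r.t. trainable variables need no watch call, because
# variables are tracked automatically; the input tensor is not.
dx = tape.gradient(loss, x)
print(dx.shape)   # (1, 28, 28)
```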
2 votes · 1 answer

tf.GradientTape giving None gradient while writing custom training loop

I'm trying to write a custom training loop. Here is a sample of what I'm trying to do. I have two training parameters, and one parameter updates the other. See the code below: x1 = tf.Variable(1.0, dtype=float) x2 = tf.Variable(1.0,…
Al Shahreyaj • 211 • 1 • 9
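When one parameter is "updated from" another inside the tape, the common failure is using `assign`, which is not a differentiable operation and breaks the gradient path. A sketch of the working pattern (variable names follow the question):

```python
import tensorflow as tf

x1 = tf.Variable(1.0)
x2 = tf.Variable(1.0)

with tf.GradientTape() as tape:
    # A tensor expression keeps the computation on the tape; calling
    # x2.assign(...) here would instead yield None gradients, because
    # assignments are not recorded as differentiable ops.
    x2_new = x2 + 0.5 * x1
    loss = x2_new ** 2

g1, g2 = tape.gradient(loss, [x1, x2])
print(g1.numpy(), g2.numpy())   # 1.5 3.0
```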
2 votes · 1 answer

How to add multiple losses into GradientTape

I am testing tf.GradientTape. I wrote a model with several output layers, each with its own loss, where I wanted to integrate GradientTape. My question is: are there specific techniques for feeding the several losses to the gradient as…
st3ff3n • 21 • 1
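The standard pattern, sketched with an illustrative two-head model: combine the per-head losses into one scalar inside the tape and take a single gradient of the total (the 0.5 weighting below is an arbitrary example choice).

```python
import tensorflow as tf

inp = tf.keras.Input(shape=(4,))
out_a = tf.keras.layers.Dense(1, name="head_a")(inp)
out_b = tf.keras.layers.Dense(1, name="head_b")(inp)
model = tf.keras.Model(inp, [out_a, out_b])

x = tf.random.normal([8, 4])
ya = tf.random.normal([8, 1])
yb = tf.random.normal([8, 1])

with tf.GradientTape() as tape:
    pa, pb = model(x, training=True)
    loss_a = tf.reduce_mean((pa - ya) ** 2)
    loss_b = tf.reduce_mean((pb - yb) ** 2)
    total = loss_a + 0.5 * loss_b     # one scalar total loss

# One gradient per trainable variable, through both heads at once.
grads = tape.gradient(total, model.trainable_variables)
print(len(grads))   # 4 (kernel and bias for each head)
```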
2 votes · 0 answers

Transformer tutorial with TensorFlow: GradientTape outside the with statement but still working

While following the TensorFlow tutorial on how to implement a transformer model, I had some doubts about the training process. The train_step function is implemented as follows: @tf.function(input_signature=train_step_signature) def train_step(inp, tar): …
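The behavior that puzzles people here is by design: exiting the `with` block only stops *recording*; the tape object keeps everything it recorded, so `gradient()` is normally called outside the block. A minimal sketch:

```python
import tensorflow as tf

v = tf.Variable(3.0)
with tf.GradientTape() as tape:
    loss = v ** 2
# Recording stops when the `with` block exits, but the tape retains its
# records, so gradient() can (and usually should) be called out here.
grad = tape.gradient(loss, v)
print(grad.numpy())   # 6.0
```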