
I am using Weights & Biases (link) to manage hyperparameter optimization and log the results. I am training with Keras on a TensorFlow backend and using the out-of-the-box logging functionality of Weights & Biases, in which I run

wandb.init(project='project_name', entity='username', config=config)

and then add a WandbCallback() to the callbacks of classifier.fit(). By default, Weights & Biases appears to save the model parameters (i.e., the model's weights and biases) and store them in the cloud. This eats up my account's storage quota, and it is unnecessary; I only care about tracking the model loss/accuracy as a function of the hyperparameters.
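
For concreteness, here is a minimal sketch of my setup (the data, model, and hyperparameter values below are placeholders, not my actual project):

import numpy as np
import wandb
from wandb.keras import WandbCallback
from tensorflow import keras

# Hyperparameters being swept (placeholder values).
config = {'hidden_units': 32, 'learning_rate': 1e-3, 'epochs': 5}
wandb.init(project='project_name', entity='username', config=config)

# Placeholder data standing in for the real dataset.
x_train = np.random.rand(256, 10)
y_train = np.random.randint(0, 2, size=256)

classifier = keras.Sequential([
    keras.layers.Dense(wandb.config.hidden_units, activation='relu', input_shape=(10,)),
    keras.layers.Dense(1, activation='sigmoid'),
])
classifier.compile(
    optimizer=keras.optimizers.Adam(wandb.config.learning_rate),
    loss='binary_crossentropy',
    metrics=['accuracy'],
)

# By default, WandbCallback uploads the trained model to the run's cloud storage.
classifier.fit(
    x_train, y_train,
    validation_split=0.2,
    epochs=wandb.config.epochs,
    callbacks=[WandbCallback()],
)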

Is it possible for me to train a model and log the loss and accuracy using Weights & Biases, but not store the model parameters in the cloud? How can I do this?

Jimmy Zhao
book_kees

1 Answer


To avoid saving the trained model weights during hyperparameter optimization, you can do something like this:

classifier.fit(..., callbacks=[WandbCallback(..., save_model=False)])

This will only log the metrics (training/validation loss, accuracy, etc.) and will not upload the model weights to the cloud.
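
For example, dropping this into the sketch from the question (same placeholder model and data), the fit call becomes:

# save_model=False disables the model upload; metrics still stream to the W&B run.
classifier.fit(
    x_train, y_train,
    validation_split=0.2,
    epochs=wandb.config.epochs,
    callbacks=[WandbCallback(save_model=False)],
)

Everything else (the wandb.init call and the metric charts in the W&B UI) stays the same.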

ayush thakur