I am having difficulty applying callbacks to a Keras Tuner hyperparameter optimiser object. Here is the code I run:
import time
from keras.callbacks import TensorBoard, EarlyStopping, ModelCheckpoint
from kerastuner.tuners import BayesianOptimization
%load_ext tensorboard
BATCH_SIZE = 32
time_stamp = time.time()
tensorboard = TensorBoard(log_dir="graphs/{}".format(time_stamp))
checkpoint = ModelCheckpoint(filepath = r"D:\Uni work\...\CNN.hdf5" , monitor = 'val_accuracy', verbose = 1, save_best_only = True )
early_stopping = EarlyStopping(monitor="val_loss", patience=3, verbose=2)
tuner = BayesianOptimization(build_model, objective="val_accuracy", max_trials=30, num_initial_points=2, project_name="audio_classifier")
tuner.search(x=train_X, y=y_cat_encoded, epochs=35, callbacks=[early_stopping], batch_size=BATCH_SIZE, validation_data=(validation_X, y_validation_cat_encoded))
Although I would ultimately like to apply the TensorBoard and checkpoint callbacks as well, the search fails even when I pass only the early-stopping callback. I get the following error:
C:\Anaconda\envs\test\lib\site-packages\kerastuner\engine\tuner.py in _deepcopy_callbacks(self, callbacks)
277 callbacks = copy.deepcopy(callbacks)
278 except:
--> 279 raise ValueError(
280 'All callbacks used during a search '
281 'should be deep-copyable (since they are '
ValueError: All callbacks used during a search should be deep-copyable (since they are reused across trials). It is not possible to do `copy.deepcopy(<tensorflow.python.keras.callbacks.EarlyStopping object at 0x000001802D138100>)`
I am not familiar with the term deep-copyable, and I am unsure what it implies is faulty in my code. Does anyone know how to address this problem?
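For context, my (possibly incomplete) understanding of deep-copying is that `copy.deepcopy` recursively duplicates an object and everything it references, and that it raises an error when the object holds something that cannot be duplicated that way, such as a thread lock. A minimal standalone sketch of both cases (the `TypeError` message may vary by Python version):

```python
import copy
import threading

# deepcopy recursively clones an object and everything it references,
# so mutating the clone leaves the original untouched
original = {"weights": [1, 2, 3]}
clone = copy.deepcopy(original)
clone["weights"].append(4)
print(original["weights"])  # [1, 2, 3]

# deepcopy fails on objects holding unpicklable resources, e.g. a thread lock
try:
    copy.deepcopy({"lock": threading.Lock()})
except TypeError as e:
    print("not deep-copyable:", e)
```

I assume the EarlyStopping object is failing in the same way as the lock here, but I do not know which of its internals triggers it.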