Suppose I have two layers:
x1 = Conv3D(16, kernel_size=(3, 1, 1), activation='relu', dilation_rate=2)(x)
x2 = Conv3D(16, kernel_size=(3, 1, 1), activation='relu', dilation_rate=3)(x)
Is there a way to share weights between these two layers?
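One possible approach (a sketch, assuming TensorFlow 2.x / tf.keras; the layer name SharedDilatedConv3D and its arguments are illustrative, not an established API): a Keras Conv3D fixes its dilation_rate at construction time, so instead you can build the kernel and bias once in a custom layer and apply them with tf.nn.convolution at each dilation rate, which genuinely reuses the same weights.

import tensorflow as tf
from tensorflow.keras import layers

class SharedDilatedConv3D(layers.Layer):
    # Applies one shared 3D kernel at several dilation rates (sketch).
    def __init__(self, filters, kernel_size, dilation_rates, **kwargs):
        super().__init__(**kwargs)
        self.filters = filters
        self.kernel_size = kernel_size        # e.g. (3, 1, 1)
        self.dilation_rates = dilation_rates  # e.g. [2, 3]

    def build(self, input_shape):
        in_channels = input_shape[-1]
        # Kernel and bias are created once, so every dilation rate reuses them.
        self.kernel = self.add_weight(
            name="kernel",
            shape=(*self.kernel_size, in_channels, self.filters),
            initializer="glorot_uniform",
            trainable=True)
        self.bias = self.add_weight(
            name="bias",
            shape=(self.filters,),
            initializer="zeros",
            trainable=True)

    def call(self, inputs):
        outputs = []
        for rate in self.dilation_rates:
            # Same weights, different dilation on each call
            # ("VALID" padding to match Conv3D's default).
            y = tf.nn.convolution(inputs, self.kernel,
                                  padding="VALID", dilations=rate)
            outputs.append(tf.nn.relu(y + self.bias))
        return outputs

# hypothetical usage, mirroring the two layers above:
# x1, x2 = SharedDilatedConv3D(16, (3, 1, 1), dilation_rates=[2, 3])(x)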