I would like to modify or prepend to the input layer of a pretrained Inception model in Keras. I know there is a way to pop and append downstream layers. Is there a similar way for upstream layers?
For example, I would like to add a layer that takes my single-channel input image and broadcasts it into 3 channels (I know there are other solutions, but let's try this one):
from keras.applications.inception_v3 import InceptionV3
from keras.models import Model
from keras.layers import Input, Dense
base_model = InceptionV3(weights='imagenet', include_top=False)
img = Input((None, None, 1))
d0 = Dense(3, kernel_initializer='Ones', use_bias=False)
img3 = d0(img)
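If I understand the Ones initializer correctly, this layer should simply copy the single input channel into all three output channels. A quick sanity check I put together (the numpy usage and names here are just for illustration):
import numpy as np
from keras.models import Model
from keras.layers import Input, Dense

img = Input((None, None, 1))
img3 = Dense(3, kernel_initializer='Ones', use_bias=False)(img)
check = Model(img, img3)

sample = np.random.rand(1, 4, 4, 1).astype('float32')
out = check.predict(sample)
print(out.shape)                                        # (1, 4, 4, 3)
print(np.allclose(out, np.repeat(sample, 3, axis=-1)))  # True -- each output channel equals the input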
It turns out I cannot simply set the input attribute, e.g. base_model.input = img3
-- it raises an exception.
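For reference, prepending alone does seem to work if the whole pretrained network is treated as a single layer and called on the new tensor (a minimal sketch, using the 1x1 Conv2D from the update below for the channel conversion):
from keras.applications.inception_v3 import InceptionV3
from keras.models import Model
from keras.layers import Input, Conv2D

base_model = InceptionV3(weights='imagenet', include_top=False)

img = Input((None, None, 1))
img3 = Conv2D(3, (1, 1), kernel_initializer='Ones', use_bias=False)(img)

# calling the pretrained Model on a tensor reuses its whole graph, as if it were one layer
features = base_model(img3)
wrapped = Model(inputs=img, outputs=features)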
Update:
I actually need to modify both the upstream and downstream layers. Currently I crop the downstream layers of my network in the following way:
import keras
from keras.applications.inception_v3 import InceptionV3
from keras.models import Model
from keras.layers import Input, Conv2D, Dense, Dropout, GlobalAveragePooling2D

n_classes = 1
final_activation = 'sigmoid'
ndense = 64
dropout = 0.5
base_trainable = False

base_model = InceptionV3(weights='imagenet', include_top=False)

# 1x1 convolution that copies the single input channel into 3 channels
img = Input((None, None, 1))
d0 = Conv2D(3, (1, 1), kernel_initializer='Ones', use_bias=False)
img3 = d0(img)
base_model(img3)
# get the third Concatenate layer and crop the network there:
cc = 0
poptherest = False
for nn, la in enumerate(base_model.layers):
    if type(la) is keras.layers.Concatenate:
        if cc == 3:
            x = la.output
            break
        cc += 1
base_model.layers = base_model.layers[:nn+1]
#x = [la.output for la in base_model.layers if type(la) is keras.layers.Concatenate][3]
x = GlobalAveragePooling2D()(x)
# let's add a fully-connected layer
x = Dropout(dropout)(x)
x = Dense(ndense, activation='relu')(x)
# and a logistic layer with n_classes outputs
predictions = Dense(n_classes, activation=final_activation)(x)
# this is the model we will train
model = Model(inputs=img, outputs=predictions)
How do I add the above-mentioned input modification to this code?
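For what it's worth, the direction I imagine a solution taking (a rough, unverified sketch -- the Concatenate index and the variable names are only placeholders) is to build an intermediate sub-model that ends at the chosen Concatenate output and then call that sub-model on img3:
# sub-model from the original Inception input up to the chosen Concatenate output
concat_layers = [la for la in base_model.layers if isinstance(la, keras.layers.Concatenate)]
cropped_base = Model(inputs=base_model.input, outputs=concat_layers[3].output)

# prepend the 1-channel -> 3-channel conversion by calling the sub-model on img3
x = cropped_base(img3)
x = GlobalAveragePooling2D()(x)
x = Dropout(dropout)(x)
x = Dense(ndense, activation='relu')(x)
predictions = Dense(n_classes, activation=final_activation)(x)
model = Model(inputs=img, outputs=predictions)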