The solution below uses TensorFlow Keras.
To access the activations, we first pass one or more images through the model; the activations then correspond to those images. The code for loading and preprocessing an input image is shown below:
import os
import tensorflow as tf
from tensorflow.keras.preprocessing import image

Test_Dir = '/Deep_Learning_With_Python_Book/Dogs_Vs_Cats_Small/test/cats'
Image_File = os.path.join(Test_Dir, 'cat.1545.jpg')

# Load the image, resizing it to the model's input size
Image = image.load_img(Image_File, target_size=(150, 150))
Image_Tensor = image.img_to_array(Image)
print(Image_Tensor.shape)  # (150, 150, 3)

# Add a batch dimension and rescale pixel values to [0, 1]
Image_Tensor = tf.expand_dims(Image_Tensor, axis=0)
Image_Tensor = Image_Tensor / 255.0
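If the dataset path above isn't available locally, the same preprocessing steps can be checked end to end on a synthetic image; the random array below is a hypothetical stand-in for cat.1545.jpg:

```python
import numpy as np
import tensorflow as tf
from PIL import Image as PILImage

# Hypothetical stand-in for cat.1545.jpg: a random 200x200 RGB image
array = np.random.randint(0, 256, (200, 200, 3), dtype='uint8')
dummy = PILImage.fromarray(array).resize((150, 150))  # mimics target_size=(150, 150)

tensor = tf.keras.preprocessing.image.img_to_array(dummy)
print(tensor.shape)  # (150, 150, 3)

tensor = tf.expand_dims(tensor, axis=0)  # add the batch dimension
tensor = tensor / 255.0                  # rescale pixels to [0, 1]
print(tensor.shape)  # (1, 150, 150, 3)
```

The shapes printed here match what the real image produces, so the rest of the code works unchanged.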
Once the model is defined, we can access the activations of any layer using the code shown below (with respect to the Cats vs. Dogs dataset):
from tensorflow.keras.models import Model

# Extract the output tensors of all the layers
Model_Outputs = [layer.output for layer in model.layers]

# Create a model that maps the original input to every layer's output
Activation_Model = Model(model.input, Model_Outputs)

# One activation array per layer, in layer order
Activations = Activation_Model.predict(Image_Tensor)
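As a self-contained sketch of this pattern, the snippet below builds a small stand-in convnet (a hypothetical architecture, not the book's exact model) and lists each layer's activation shape for a random input:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers
from tensorflow.keras.models import Model

# A small stand-in convnet (hypothetical; not the book's exact architecture)
inputs = tf.keras.Input(shape=(150, 150, 3))
x = layers.Conv2D(8, (3, 3), activation='relu')(inputs)
x = layers.MaxPooling2D((2, 2))(x)
x = layers.Flatten()(x)
x = layers.Dense(16, activation='relu')(x)
outputs = layers.Dense(1, activation='sigmoid')(x)
model = Model(inputs, outputs)

# A random "image" standing in for the preprocessed cat photo
Image_Tensor = np.random.rand(1, 150, 150, 3).astype('float32')

# Same pattern as above: one output per layer
Model_Outputs = [layer.output for layer in model.layers]
Activation_Model = Model(model.input, Model_Outputs)
Activations = Activation_Model.predict(Image_Tensor)

# One activation array per layer, in layer order
for layer, act in zip(model.layers, Activations):
    print(layer.name, act.shape)
```

Because the activation model returns one array per layer, you can index into `Activations` by layer position, e.g. `Activations[-2]` for the first fully connected layer here.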
The output of the first fully connected layer (for the Cats vs. Dogs data) can be inspected as follows:
print('Shape of Activation of First Fully Connected Layer is', Activations[-2].shape)
print('------------------------------------------------------------------------------------------')
print('Activation of First Fully Connected Layer is', Activations[-2])
Its output is shown below:
Shape of Activation of First Fully Connected Layer is (1, 512)
------------------------------------------------------------------------------------------
Activation of First Fully Connected Layer is [[0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0.02759874 0. 0. 0. 0.
0. 0. 0.00079661 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0.04887392 0. 0.
0.04422646 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.01124999
0. 0. 0. 0. 0. 0.
0. 0. 0. 0.00286965 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0.00027195 0.
0. 0.02132209 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0.00511147 0. 0. 0.02347952 0.
0. 0. 0. 0. 0. 0.
0.02570331 0. 0. 0. 0. 0.03443285
0. 0. 0. 0. 0. 0.
0. 0.0068848 0. 0. 0. 0.
0. 0. 0. 0. 0.00936454 0.
0.00389365 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0.00152553 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0.09215052 0. 0. 0.0284613 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0.00198757 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0.02395868 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0.01150922 0.0119792
0. 0. 0. 0. 0. 0.
0.00775307 0. 0. 0. 0. 0.
0. 0. 0. 0.01026413 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0.01522083 0. 0.00377031 0. 0.
0. 0. 0. 0. 0. 0.
0. 0.02235368 0. 0. 0. 0.
0. 0. 0. 0. 0.00317057 0.
0. 0. 0. 0. 0. 0.
0.03029975 0. 0. 0. 0. 0.
0. 0. 0.03843511 0. 0. 0.
0. 0. 0. 0. 0. 0.02327696
0.00557329 0. 0.02251234 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0.01655817 0. 0.
0. 0. 0. 0. 0.00221658 0.
0. 0. 0. 0.02087847 0. 0.
0. 0. 0.02594821 0. 0. 0.
0. 0. 0.01515464 0. 0. 0.
0. 0. 0. 0. 0.00019883 0.
0. 0. 0. 0. 0. 0.00213376
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.00237587
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0.02521542 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. 0. 0. 0. 0.
0.00490679 0. 0.04504126 0. 0. 0.
0. 0. 0. 0. 0. 0.
0. 0. ]]
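Notice that most of the 512 units above are exactly zero. This is typical for a ReLU dense layer, which outputs zero for any negative pre-activation. A small sketch (using hypothetical values, not the exact output above) that quantifies this sparsity:

```python
import numpy as np

# Hypothetical activation vector: 512 units, only a handful non-zero,
# mimicking the ReLU output shown above
act = np.zeros((1, 512), dtype='float32')
act[0, [19, 26, 63, 66]] = [0.0276, 0.0008, 0.0489, 0.0442]

sparsity = float(np.mean(act == 0.0))
print(f'{sparsity:.1%} of units are inactive')  # 99.2% of units are inactive
```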
For more information, please refer to Section 5.4.1, Visualizing Intermediate Activations, of the book Deep Learning with Python by François Chollet, the creator of Keras.
Hope this helps. Happy Learning!