
I'm trying to learn deep learning using Keras and TensorFlow, and I came across code explaining linear regression at https://www.tensorflow.org/tutorials/keras/regression in which a normalization layer is created with normalizer = tf.keras.layers.Normalization(axis=-1). Could someone please explain the meaning of axis=-1? I tried looking at the API documentation, but I couldn't understand the explanation there. I know that axis=0 represents rows and axis=1 represents columns, right? Thanks in advance.

ADITYA
  • Please see https://stackoverflow.com/questions/47435526/what-is-the-meaning-of-axis-1-in-keras-argmax – jav Sep 21 '22 at 02:36

1 Answer


Per the documentation this layer is:

A preprocessing layer which normalizes continuous features.

Then, under the description of axis:

Defaults to -1, where the last axis of the input is assumed to be a feature dimension and is normalized per index.

From these two statements, we can see that -1 in this context simply means the last axis. This is fairly common in Python; for example, you can index the last element of a list with my_list[-1].
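As a minimal sketch (the toy dataset below is made up purely for illustration), this is what axis=-1 means for a 2-D input of shape (samples, features): the last axis is the feature axis, so the layer learns one mean and one variance per feature and normalizes each column independently.

```python
import numpy as np
import tensorflow as tf

# Hypothetical toy data: 4 samples, 3 features (shape = (samples, features)).
data = np.array([[1.0, 200.0, 0.5],
                 [2.0, 220.0, 0.7],
                 [3.0, 240.0, 0.9],
                 [4.0, 260.0, 1.1]], dtype=np.float32)

# axis=-1 points at the last axis (here the feature axis), so each of the
# 3 features gets its own mean and variance.
normalizer = tf.keras.layers.Normalization(axis=-1)
normalizer.adapt(data)  # computes the per-feature mean and variance from data

print(normalizer.mean.numpy())   # one mean per feature: 2.5, 230.0, 0.8
print(normalizer(data).numpy())  # each column now has mean ~0 and variance ~1
```

After adapt, calling the layer subtracts each feature's mean and divides by its standard deviation, which is exactly what the regression tutorial relies on.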

Kraigolas