I was trying to use the TensorBoard histogram facility to plot the weights of my trained network. My CNN is based on AlexNet and is being used as an estimator (the motivation for this can be found here). However, I am having trouble understanding the generated histogram, specifically what the vertical axis means. I have viewed similar posts by others (this one, this one and this one); however, those either:
- talk about changes in the histogram across layers
- refer to the Distributions tab in TensorBoard (which, I think, was called the histogram tab in earlier versions of TensorBoard)
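For context, here is a minimal sketch of how the histogram summary is being written (the variable name, shape and log directory are placeholders; in my real code this happens inside the estimator's `model_fn`, and the estimator writes the summaries to its model directory automatically):

```python
import tensorflow as tf

# Placeholder stand-in for the first conv layer's kernel of my AlexNet-style net
tf.reset_default_graph()
weights = tf.get_variable("conv1_kernel", shape=[11, 11, 3, 96],
                          initializer=tf.glorot_uniform_initializer())

# This is the summary that produces the histogram I'm asking about
summary_op = tf.summary.histogram("conv1_weights", weights)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    writer = tf.summary.FileWriter("./logs", sess.graph)
    # In the estimator this is written periodically; here I write one step by hand
    writer.add_summary(sess.run(summary_op), global_step=0)
    writer.close()
```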
The weight histogram for the first convolutional layer is shown below.
What does the 61.8 at 0.0490 actually mean? I know what histograms are in general, but how can you have 61.8 of something? There was a GitHub post that mentioned the value is related to density, but it doesn't really make clear what is meant by density (other than vaguely mentioning a normalised density, whatever that means). If it were a probability density, shouldn't the area under the graph equal 1 (see the quick numpy check at the end of this post)? Any clarification on what the vertical axis means would be much appreciated, especially how it is supposed to be interpreted. Also, if I have missed another post that my question duplicates, please let me know.
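For reference, this is the behaviour I would expect from a normalised probability density; it is just a quick numpy check with made-up sample values, not TensorBoard code:

```python
import numpy as np

# Stand-in samples for the layer's weights
samples = np.random.normal(loc=0.0, scale=0.05, size=10000)

# With density=True, numpy scales the counts so the histogram integrates to 1
counts, edges = np.histogram(samples, bins=50, density=True)
bin_widths = np.diff(edges)
print(np.sum(counts * bin_widths))  # ~1.0, up to floating point error
```

The TensorBoard plot clearly isn't normalised this way, which is what confuses me about calling it a density.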