This is because that PNG has a palette, so the pixel data are palette indices. np.asarray takes the raw pixel data; the palette is not taken into account.
Use .getpalette() is not None on the PIL image object to detect whether the image has a palette, and .convert() to convert the pixel data to "real" colors.
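A minimal sketch of that check (assuming Pillow and NumPy; amdNt.png is the file from the dump below):

import numpy as np
from PIL import Image

im = Image.open("amdNt.png")
print(im.mode)                      # "P" for palette images

# Convert only if there actually is a palette
rgb = im.convert("RGB") if im.getpalette() is not None else im

print(np.asarray(im).shape)         # (225, 225)    -- palette indices
print(np.asarray(rgb).shape)        # (225, 225, 3) -- real R, G, B values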
This is a part of the raw data, to give you an idea of what it looks like. It includes a corner of the capital 'C' in the inscription:

The strange thing here that catches the eye is that while black is 0 (as expected), white is for some reason 1 (rather than the expected 255), and the other colors are higher but still small, the highest value being 20, which hints that this is a palette thing.
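You can see this in numbers by inspecting the raw indices directly (a small sketch, same assumptions as above):

import numpy as np
from PIL import Image

raw = np.asarray(Image.open("amdNt.png"))   # palette indices, not colors
print(raw.dtype, raw.min(), raw.max())      # uint8 and small values, nowhere near 255
print(np.unique(raw))                       # the handful of distinct indices in use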
When you .imshow this data, it is normalized to [0, 1], mapping those 20 values, in order, to evenly spaced points along the spectrum of the Colormap used.
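So to display the image with its real colors, convert it before handing it to imshow (a sketch, assuming matplotlib):

import numpy as np
import matplotlib.pyplot as plt
from PIL import Image

im = Image.open("amdNt.png")
if im.getpalette() is not None:
    im = im.convert("RGB")          # look the indices up in the palette

plt.imshow(np.asarray(im))          # RGB data: no colormap is applied
plt.show()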
png-parser can show the palette data:
$ png-parser -d amdNt.png
<...>
Filename: amdNt.png | Size: 2925
(0)
IHDR:
Data size : 13
- Width : 225
- Height : 225
- Bit depth : 8
- Color type : Code = 3 ; Depth Allow = [1, 2, 4, 8] ; Each pixel is a palette index; a PLTE chunk must appear.
- Compression method : 0
- Filter method : 0
- Interlace method : 0
(1)
PLTE:
Data size : 69
b'\x00\x00\x00\xff\xff\xff\xfe\xfe\xfe\x01\x01\x01\xfd\xfd\xfd\xb4\xb4\xb4\xb2\xb2\xb2\xb6\xb6\xb6\xaf\xaf\xaf\x05\x05\x05\xfa\xfa\xfa\x10\x10\x10\xbb\xbb\xbb\x16\x16\x16\xb8\xb8\xb8\x0e\x0e\x0e\xf5\xf5\xf5\xaa\xaa\xaa\xc0\xc0\xc0\xf0\xf0\xf0\x19\x19\x19\xc4\xc4\xc4\xa6\xa6\xa6'
<...>
as well as the palette itself:

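The same palette can be read from Python with getpalette(), which returns a flat [R0, G0, B0, R1, G1, B1, ...] list (a sketch; depending on the Pillow version the list may be padded to 256 entries):

from PIL import Image

pal = Image.open("amdNt.png").getpalette()
colors = list(zip(pal[0::3], pal[1::3], pal[2::3]))
print(colors[:5])   # (0, 0, 0), (255, 255, 255), (254, 254, 254), ... as in the PLTE chunk above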