As you're learning the hard way, stuffing blobs into a text database is probably the worst sin a novice data manager can commit: it is bloated, unwieldy and slow. It is best to leave the source files in their fast, native, compressed state and simply reference them in the DB by a unique ID and a file storage name. Rant over.
The fact that they are fixed-size blocks of 40K suggests the data is chunked, so several chunks are needed to reassemble one whole BLOB.
The blob you presented appears to be just part of a PNG image that, if I am interpreting the header correctly, is 2164 pixels wide by 835 pixels high (22.54 x 8.70 inches at 96 DPI).
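Those dimensions can be read straight out of the PNG header. A minimal sketch (the synthetic header below is illustrative, built with the dimensions quoted above, not your actual data):

```python
import struct

def png_dimensions(data: bytes) -> tuple[int, int]:
    """Read width/height from a PNG's IHDR chunk.

    PNG layout: 8-byte signature, 4-byte chunk length, the tag b'IHDR',
    then width and height as big-endian uint32s at offsets 16 and 20.
    """
    if data[:8] != b"\x89PNG\r\n\x1a\n":
        raise ValueError("not a PNG: bad signature")
    return struct.unpack(">II", data[16:24])

# Synthetic header carrying the dimensions reported above (2164 x 835).
header = (b"\x89PNG\r\n\x1a\n"
          + struct.pack(">I", 13) + b"IHDR"
          + struct.pack(">II", 2164, 835))
print(png_dimensions(header))  # (2164, 835)
```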

HOWEVER, the decoded output is only 4 pixels high within that oddly large canvas, which might be correct if it's just the first part of a much longer, truncated stream. The colour range from such a narrow band does not help determine the subject matter, although there appears to be a distinct near-white margin down the right-hand side, but not on the top or left edge.
Your 40K hex chunk translates to about 20K of binary with the characteristics of a PNG, BUT a PNG starts with 0x89, so you have a problem: the data is prefixed with 0x 00 22 40 DD BF (decimal = 574,676,415, far too big for the expanded PNG memory requirement, which is estimated at 5,420,860 bytes).
We can discard the leading 0x as the signature of a hex stream and use the remainder as I did above, but what is the significance of the odd 00 22 40 DD BF? Most likely it contains, in part, an indicator of the type or final full-length size, and/or a pointer to the next chunk.
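A quick sketch of how one might probe those 5 bytes, assuming the chunk arrives as hex text like your fragment (the 13-byte string here is illustrative: the prefix followed by the start of a PNG signature):

```python
hex_text = "0x002240DDBF89504E470D0A1A0A"   # illustrative: 5-byte prefix + PNG signature start
raw = bytes.fromhex(hex_text[2:])            # drop the "0x" hex-stream marker

prefix, body = raw[:5], raw[5:]
assert body[:1] == b"\x89"                   # the real PNG begins after the prefix

# Candidate readings of the 5 prefix bytes (pure speculation):
print(int.from_bytes(prefix, "big"))         # 574676415 -- the figure quoted above
print(int.from_bytes(prefix[1:], "little"))  # last 4 bytes read little-endian
```

Comparing these candidate integers against the known total file size (once you have it) is the quickest way to test the "it is a length" theory.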
What you need to do is extract that image by your normal method and compare the total expected file size: translated into 20 KB of binary, this chunk can only equate to about 0.5 percent of the expected total. In that case you need to determine how and where the rest of the image is stored in order to concatenate all the (200?) parts into one homogeneous blob, i.e. a single image.
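Once you know where the remaining chunks live and in what order, the reassembly itself is trivial. A sketch, assuming each chunk is hex text and each carries the same 5-byte prefix (if only the first chunk has it, drop the slice for the rest):

```python
def reassemble(chunks_hex: list[str]) -> bytes:
    """Join ordered hex-text chunks into one binary blob."""
    parts = []
    for hx in chunks_hex:
        # Accept chunks with or without the "0x" hex-stream marker.
        raw = bytes.fromhex(hx[2:] if hx.startswith("0x") else hx)
        parts.append(raw[5:])  # strip the odd 5-byte prefix (assumption)
    return b"".join(parts)

# Two tiny dummy chunks, each with a 5-byte zero prefix:
blob = reassemble(["0x0000000000DEADBEEF", "0x0000000000CAFEBABE"])
print(blob.hex())  # deadbeefcafebabe
```

The correct ordering column (a sequence number, a next-chunk pointer, or simply the primary key) has to come from the schema; that is a question for whoever built it.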
You need sight of the method by which chunks are extracted, converted and stitched together, using some measure of expected file size. What we know is that your entry has 5 bytes before the data body, but the norm for a LONGBLOB length prefix is 4 bytes and for a MEDIUMBLOB it is 3 (see https://www.educba.com/mysql-blob/), so we have no idea why it is non-standard other than that a programmer chose to do it that way.
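For comparison, this is what decoding a standard-width length prefix would look like. A sketch only: it assumes a little-endian prefix, as in MySQL's internal row storage, which your 5-byte oddity evidently does not follow:

```python
def decode_length_prefix(raw: bytes, width: int) -> tuple[int, bytes]:
    """Split off an n-byte little-endian length prefix, as MySQL's row
    format uses for MEDIUMBLOB (3 bytes) and LONGBLOB (4 bytes)."""
    length = int.from_bytes(raw[:width], "little")
    return length, raw[width:]

# A 3-byte prefix declaring that 4 bytes of payload follow:
sample = (4).to_bytes(3, "little") + b"\x89PNG"
n, body = decode_length_prefix(sample, 3)
print(n, body)  # 4 b'\x89PNG'
```

If your 5-byte prefix were a length in any of these conventional widths, one of its readings would match the payload size; since 574,676,415 wildly exceeds it, the prefix must encode something else as well.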
A fairly similar problem, where I suggested knowledge of the DB structure was needed, is at "How to retrieve original pdf stored as MySQL mediumblob?"; the answer there was to interrogate the developer, who had placed the data in an even odder way than yours.