I am currently programming an autoencoder for image compression. From a previous post I now have final confirmation that I cannot use pure Python functions as loss functions in either Keras or TensorFlow. (And I am slowly beginning to understand why ;-)
I would like to do some experiments using SSIM as a loss function and as a metric. Now it seems I might be in luck: there is already an implementation of it in TensorFlow, see: https://www.tensorflow.org/api_docs/python/tf/image/ssim
tf.image.ssim(img1, img2, max_val)
In addition, bsautermeister kindly provided an implementation here on Stack Overflow: SSIM / MS-SSIM for TensorFlow.
My question now is: how would I use it as a loss function with the MNIST dataset? As far as I can tell, the function does not accept a batch tensor, only two individual images. And will the gradient be computed automatically? From what I understand, it should be, as long as the function is implemented with TensorFlow ops or the Keras backend.
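For what it's worth, this is roughly what I imagine it might look like. This is only a guess, assuming tf.image.ssim accepts batched 4-D tensors of shape (batch, height, width, channels) and that minimizing 1 - SSIM is the right objective; I have not verified any of it:

import tensorflow as tf

def ssim_loss(y_true, y_pred):
    # tf.image.ssim returns one SSIM value per image in the batch.
    # SSIM lies in [-1, 1], with 1 meaning identical images, so
    # 1 - mean(SSIM) gives something for gradient descent to minimize.
    # max_val=1.0 assumes the images are scaled to [0, 1].
    return 1.0 - tf.reduce_mean(tf.image.ssim(y_true, y_pred, max_val=1.0))

# Hypothetical usage with an existing Keras autoencoder model:
# autoencoder.compile(optimizer='adam', loss=ssim_loss)

Is something along these lines correct, or does it fail for a reason I am not seeing?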
I would be very grateful for a minimum working example (MWE) of how to use any of the previously mentioned SSIM implementations as a loss function, either in Keras or in TensorFlow.
Maybe we can reuse the MWE for an autoencoder that I provided with my previous question: keras custom loss pure python (without keras backend)
If it is not possible to glue my Keras autoencoder together with the SSIM implementations, would it be possible with an autoencoder implemented directly in TensorFlow? I have one of those as well and can provide it.
I am working with Python 3.5, Keras (with the TensorFlow backend) and, if necessary, TensorFlow directly. Currently I am using the MNIST dataset (the one with the handwritten digits).
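In case it matters, this is how I load and shape the MNIST data. I scale to [0, 1] and add a channel axis, on the assumption that the SSIM function wants image tensors with a trailing channel dimension:

from keras.datasets import mnist
import numpy as np

(x_train, _), (x_test, _) = mnist.load_data()
# Scale pixel values to [0, 1] and add a trailing channel axis,
# so each image has shape (28, 28, 1).
x_train = x_train.astype('float32')[..., np.newaxis] / 255.0
x_test = x_test.astype('float32')[..., np.newaxis] / 255.0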
Thanks for any help!
(P.S.: Several people seem to be working on similar things. An answer to this post may also be useful for Keras - MS-SSIM as loss function)