Here is an implementation of weight maps in TensorFlow: http://tf-unet.readthedocs.io/en/latest/_modules/tf_unet/unet.html. You should be able to adapt it for Keras in a custom loss function. The relevant code:
    def _get_cost(self, logits, cost_name, cost_kwargs):
        """
        Constructs the cost function. Optional arguments are:

        class_weights: weights for the different classes in case of multi-class imbalance
        regularizer: power of the L2 regularizers added to the loss function
        """
        flat_logits = tf.reshape(logits, [-1, self.n_class])
        flat_labels = tf.reshape(self.y, [-1, self.n_class])
        if cost_name == "cross_entropy":
            class_weights = cost_kwargs.pop("class_weights", None)

            if class_weights is not None:
                class_weights = tf.constant(np.array(class_weights, dtype=np.float32))

                # per-pixel weight: the one-hot labels select the weight of the true class
                weight_map = tf.multiply(flat_labels, class_weights)
                weight_map = tf.reduce_sum(weight_map, axis=1)

                # per-pixel cross-entropy, scaled by the weight map
                loss_map = tf.nn.softmax_cross_entropy_with_logits(logits=flat_logits,
                                                                   labels=flat_labels)
                weighted_loss = tf.multiply(loss_map, weight_map)

                loss = tf.reduce_mean(weighted_loss)
            else:
                loss = tf.reduce_mean(tf.nn.softmax_cross_entropy_with_logits(logits=flat_logits,
                                                                              labels=flat_labels))

        return loss