Custom loss function in Keras

Question:

I’m working on an image class-incremental classifier, using a CNN as a feature extractor and a fully-connected block for classification.

First, I fine-tuned a pre-trained VGG network to do a new task. Once the net is trained for the new task, I store some exemplars for every class in order to avoid forgetting when new classes become available.

When new classes become available, I have to compute the outputs for every exemplar, including the exemplars for the new classes. Then, by padding the outputs for the old classes with zeros and appending the label corresponding to each new class to the new-class outputs, I get my new labels. For example, if 3 new classes come in:

Old class type output: [0.1, 0.05, 0.79, ..., 0, 0, 0]

New class type output: [0.1, 0.09, 0.3, 0.4, ..., 1, 0, 0] (the last outputs correspond to the new classes, with a 1 for this exemplar’s class).
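In code, the construction looks something like this (the class counts and values are just illustrative):

import numpy as np

n_new = 3  # number of newly added classes (illustrative)

# Old-class exemplar: stored network outputs, padded with zeros for the new classes
old_target = np.concatenate([np.array([0.1, 0.05, 0.79]), np.zeros(n_new)])

# New-class exemplar: recorded outputs plus a one-hot entry marking its class
one_hot = np.zeros(n_new)
one_hot[0] = 1.0  # this exemplar belongs to the first new class
new_target = np.concatenate([np.array([0.1, 0.09, 0.3, 0.4]), one_hot])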

My question is: how can I change the loss function to a custom one in order to train for the new classes?
The loss function that I want to implement is defined as:

[image: the loss function, defined as the sum of a distillation loss term and a classification loss term]

where the distillation loss is computed on the outputs for the old classes to avoid forgetting, and the classification loss is computed on the new classes.

If you could provide me a sample of code showing how to change the loss function in Keras, that would be nice.

Thanks!!!!!

Asked By: Eric


Answers:

All you have to do is define a function for that, using Keras backend functions for the calculations. The function must take the true values and the model’s predicted values as its arguments.

Now, since I’m not sure what g, q, x and y are in your function, I’ll just create a basic example here, without caring about what it means or whether it’s actually a useful function:

import keras.backend as K

def customLoss(yTrue, yPred):
    # any expression built from backend ops on yTrue/yPred works here
    return K.sum(K.log(yTrue) - K.log(yPred))
    

All backend functions can be seen here.

After that, compile your model using that function instead of a regular one:

model.compile(loss=customLoss, optimizer=.....)
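
For your case specifically, here is a minimal sketch of a combined loss, assuming the first n_old outputs belong to the old classes and the rest to the new ones (the split index and the particular cross-entropy terms are assumptions, not your exact formula):

import keras.backend as K

n_old = 3  # hypothetical number of old-class outputs

def distillationPlusClassification(yTrue, yPred):
    # distillation term: match the stored soft outputs of the old classes
    distillation = K.mean(K.binary_crossentropy(yTrue[:, :n_old], yPred[:, :n_old]), axis=-1)
    # classification term: one-hot targets for the new classes
    classification = K.categorical_crossentropy(yTrue[:, n_old:], yPred[:, n_old:])
    return distillation + classification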
Answered By: Daniel Möller

Since Keras is no longer multi-backend (source), operations for custom losses should be written directly in TensorFlow, rather than using the backend.

You can make a custom loss with TensorFlow by making a function that takes y_true and y_pred as arguments, as suggested in the documentation:

import tensorflow as tf

x = tf.random.uniform(minval=0, maxval=1, shape=(10, 1), dtype=tf.float32)
y = tf.random.uniform(minval=0, maxval=1, shape=(10, 1), dtype=tf.float32)

def custom_mse(y_true, y_pred):
    # mean squared error, reduced per sample over the last axis
    squared_difference = tf.square(y_true - y_pred)
    return tf.reduce_mean(squared_difference, axis=-1)

custom_mse(x, y)
<tf.Tensor: shape=(10,), dtype=float32, numpy=
array([0.30084264, 0.03535452, 0.10345092, 0.28552982, 0.02426687,
       0.04410492, 0.01701574, 0.55496216, 0.74927425, 0.05747304],
      dtype=float32)>

Then you can set your custom loss in model.compile(). Here’s a complete example:

x = tf.random.uniform(minval=0, maxval=1, shape=(1000, 4), dtype=tf.float32)
y = tf.multiply(tf.reduce_sum(x, axis=-1), 5) # y is a function of x

model = tf.keras.Sequential([
    tf.keras.layers.Dense(16, input_shape=[4], activation='relu'),
    tf.keras.layers.Dense(32, activation='relu'),
    tf.keras.layers.Dense(1)
])

model.compile(loss=custom_mse, optimizer='adam')

history = model.fit(x, y, epochs=5)
Train on 1000 samples
Epoch 1/5
  32/1000 [..............................] - ETA: 10s - loss: 99.5402
1000/1000 [==============================] - 0s 371us/sample - loss: 105.6800
Epoch 2/5
  32/1000 [..............................] - ETA: 0s - loss: 89.2909
1000/1000 [==============================] - 0s 35us/sample - loss: 98.8208
Epoch 3/5
  32/1000 [..............................] - ETA: 0s - loss: 86.4339
1000/1000 [==============================] - 0s 34us/sample - loss: 82.7988
Epoch 4/5
  32/1000 [..............................] - ETA: 0s - loss: 75.2580
1000/1000 [==============================] - 0s 33us/sample - loss: 52.4585
Epoch 5/5
  32/1000 [..............................] - ETA: 0s - loss: 28.1625
1000/1000 [==============================] - 0s 34us/sample - loss: 17.8190
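
Note that a model compiled with a custom loss cannot be loaded back automatically: you have to hand the function to load_model through custom_objects (the 'model.h5' path below is just a placeholder):

model.save('model.h5')
model = tf.keras.models.load_model('model.h5', custom_objects={'custom_mse': custom_mse})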
Answered By: Nicolas Gervais