How do you use Keras LeakyReLU in Python?

Question:

I am trying to produce a CNN using Keras, and wrote the following code:

import keras
from keras.models import Sequential
from keras.layers import Conv2D, MaxPooling2D, Activation, Flatten, Dense

batch_size = 64
epochs = 20
num_classes = 5

cnn_model = Sequential()
cnn_model.add(Conv2D(32, kernel_size=(3, 3), activation='linear',
                     input_shape=(380, 380, 1), padding='same'))
cnn_model.add(Activation('relu'))
cnn_model.add(MaxPooling2D((2, 2), padding='same'))
cnn_model.add(Conv2D(64, (3, 3), activation='linear', padding='same'))
cnn_model.add(Activation('relu'))
cnn_model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))
cnn_model.add(Conv2D(128, (3, 3), activation='linear', padding='same'))
cnn_model.add(Activation('relu'))
cnn_model.add(MaxPooling2D(pool_size=(2, 2), padding='same'))
cnn_model.add(Flatten())
cnn_model.add(Dense(128, activation='linear'))
cnn_model.add(Activation('relu'))
cnn_model.add(Dense(num_classes, activation='softmax'))

cnn_model.compile(loss=keras.losses.categorical_crossentropy,
                  optimizer=keras.optimizers.Adam(), metrics=['accuracy'])

I want to use Keras’s LeakyReLU instead of Activation('relu'). I tried substituting LeakyReLU(alpha=0.1) directly, but since LeakyReLU is a layer in Keras rather than an activation function, I get an error about passing an activation layer where an activation function is expected.

How can I use LeakyReLU in this example?

Asked By: Jack Trute


Answers:

All advanced activations in Keras, including LeakyReLU, are available as layers, not as activation functions; therefore, you should use them as such:

from keras.layers import LeakyReLU

# instead of cnn_model.add(Activation('relu'))
# use
cnn_model.add(LeakyReLU(alpha=0.1))
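
For example, applied to the first convolutional block of the model in the question (the remaining Activation('relu') lines are replaced the same way; activation='linear' can stay, since linear is the identity):

cnn_model.add(Conv2D(32, kernel_size=(3, 3), activation='linear',
                     input_shape=(380, 380, 1), padding='same'))
cnn_model.add(LeakyReLU(alpha=0.1))  # replaces Activation('relu')
cnn_model.add(MaxPooling2D((2, 2), padding='same'))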
Answered By: desertnaut

Sometimes you just want a drop-in replacement for a built-in activation, without having to add extra activation layers just for this purpose.

For that, you can use the fact that the activation argument can be a callable object.

lrelu = lambda x: tf.keras.activations.relu(x, alpha=0.1)
model.add(Conv2D(..., activation=lrelu, ...))
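
A minimal self-contained version of this approach (a sketch; the layer size and input shape are illustrative, not from the question):

import tensorflow as tf
from tensorflow.keras.layers import Conv2D

# LeakyReLU as a plain callable, used where a string activation would go
lrelu = lambda x: tf.keras.activations.relu(x, alpha=0.1)

model = tf.keras.Sequential()
model.add(Conv2D(32, (3, 3), activation=lrelu,
                 input_shape=(380, 380, 1), padding='same'))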

Since a Layer is also a callable object, you could also simply use

model.add(Conv2D(..., activation=tf.keras.layers.LeakyReLU(alpha=0.1), ...))

which now works in TF2. This is a better solution, as it avoids the need to pass custom_objects when loading the model, as @ChristophorusReyhan mentioned.
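
To illustrate the loading caveat (a sketch; the file name and layer sizes are made up, and a named function is used instead of a lambda so it serializes by name):

import tensorflow as tf

def lrelu(x):
    return tf.keras.activations.relu(x, alpha=0.1)

model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation=lrelu, input_shape=(4,)),
])
model.save('lrelu_model.h5')

# the custom callable must be registered when loading...
restored = tf.keras.models.load_model('lrelu_model.h5',
                                      custom_objects={'lrelu': lrelu})

# ...whereas a model built with activation=tf.keras.layers.LeakyReLU(alpha=0.1)
# reloads with a plain load_model call.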

Answered By: P-Gn

You can import the LeakyReLU class to keep the code cleaner and then use it like any other activation.

If you choose not to specify alpha, don’t forget the parentheses: LeakyReLU().

import tensorflow as tf
from tensorflow.keras.layers import LeakyReLU

model = tf.keras.Sequential()
model.add(tf.keras.layers.Dense(512, activation=LeakyReLU()))
model.add(tf.keras.layers.Dense(512, activation=LeakyReLU(alpha=0.1)))
Answered By: Adi Shumely