he_normal kernel initialization and global average pooling

Question:

I’m trying to implement he_normal kernel initialization and global average pooling in my model, but I don’t know how to do it.

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, Flatten, Dense

model = Sequential([
    Conv2D(16, 3, padding='same', activation='relu', input_shape=(100, 100,1)),
    MaxPooling2D(),
    Conv2D(32, 3, padding='same', activation='relu'),
    MaxPooling2D(),
    Conv2D(64, 3, padding='same', activation='relu'),
    MaxPooling2D(),
    Conv2D(128, 3, padding='same', activation='relu'),
    MaxPooling2D(),
    Flatten(),
    Dense(215, activation='relu'),
    Dense(10)
])
Asked By: Smurf Again


Answers:

Every Keras layer with weights has a kernel_initializer argument, so you can use it to pass your initialization method (he_normal is built into Keras).
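
For example (assuming TensorFlow 2.x, where Keras is available as tf.keras), you can pass the initializer either by name or as an object, which also lets you fix a seed:

import tensorflow as tf
from tensorflow.keras.layers import Dense

layer_by_name = Dense(32, kernel_initializer='he_normal')
layer_by_object = Dense(32, kernel_initializer=tf.keras.initializers.HeNormal(seed=42))  # reproducible draws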

Global average pooling for images averages each feature map over its spatial dimensions, reducing a 4D tensor of shape (batch, height, width, channels) to a 2D tensor of shape (batch, channels). It can be used instead of the Flatten operation.
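
You can verify the shape reduction directly (a small sketch with random data):

import numpy as np
from tensorflow.keras.layers import GlobalAvgPool2D

x = np.random.uniform(0, 1, (3, 12, 12, 128)).astype('float32')  # (batch, height, width, channels)
pooled = GlobalAvgPool2D()(x)
print(pooled.shape)  # (3, 128): one averaged value per feature map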

I also suggest using a softmax activation in your last layer to get probability scores if you are working on a classification problem.
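
As a quick illustration with made-up logits, softmax turns the raw outputs into a probability distribution:

import numpy as np
import tensorflow as tf

logits = np.array([[2.0, 1.0, 0.1]])
probs = tf.nn.softmax(logits).numpy()
print(probs, probs.sum())  # values in (0, 1) summing to 1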

Here is a full example:

import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Conv2D, MaxPooling2D, GlobalAvgPool2D, Dense

# dummy data: 3 random grayscale images with integer class labels
n_class, n_samples = 10, 3
X = np.random.uniform(0, 1, (n_samples, 100, 100, 1))
y = np.random.randint(0, n_class, n_samples)

model = Sequential([
    Conv2D(16, 3, padding='same', activation='relu', kernel_initializer='he_normal',
           input_shape=(100, 100,1)),
    MaxPooling2D(),
    Conv2D(32, 3, padding='same', activation='relu', kernel_initializer='he_normal'),
    MaxPooling2D(),
    Conv2D(64, 3, padding='same', activation='relu', kernel_initializer='he_normal'),
    MaxPooling2D(),
    Conv2D(128, 3, padding='same', activation='relu', kernel_initializer='he_normal'),
    GlobalAvgPool2D(),
    Dense(215, activation='relu'),
    Dense(n_class, activation='softmax')
])

# sparse_categorical_crossentropy because the labels are integer class indices, not one-hot
model.compile('adam', 'sparse_categorical_crossentropy')
model.fit(X, y, epochs=3)
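
After training, the predicted class for each sample can be read off from the softmax outputs:

probs = model.predict(X)      # shape (n_samples, n_class), each row sums to 1
preds = probs.argmax(axis=1)  # predicted class index per sample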
Answered By: Marco Cerliani