How can I tune the optimization function with Keras Tuner?

Question:

How can I tune the optimization function with Keras Tuner? I want to try SGD, Adam and RMSprop.

I tried:

hp_lr = hp.Choice('learning_rate', values=[1e-2, 1e-3, 1e-4])
hp_optimizer = hp.Choice('optimizer', values=[SGD(learning_rate=hp_lr), RMSprop(learning_rate=hp_lr), Adam(learning_rate=hp_lr)])

model.compile(optimizer=hp_optimizer,
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

but this doesn’t work; it fails with "A Choice can contain only one type of value".

Asked By: Filippos Vlahos


Answers:

Probably the best way is to do something like this:

hp_optimizer = hp.Choice('optimizer', values=['sgd', 'rmsprop', 'adam'])

if hp_optimizer == 'sgd':
    optimizer = SGD(learning_rate=hp_lr)
elif hp_optimizer == 'rmsprop':
    optimizer = RMSprop(learning_rate=hp_lr)
elif hp_optimizer == 'adam':
    optimizer = Adam(learning_rate=hp_lr)
else:
    raise ValueError(f"unknown optimizer: {hp_optimizer}")

model.compile(optimizer=optimizer,
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

Note that a bare raise is only valid inside an except block, hence the explicit ValueError (you could also drop the else branch entirely, since it should never be hit). Even if the different optimizers were of the same class, hp.Choice only allows ints, floats, bools and strings, so I don’t see a way around doing it like this.
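For completeness, the same dispatch can be collapsed from an if/elif chain into a dict lookup that maps the hp.Choice string to a constructor, so only the selected optimizer is ever built. A minimal, framework-free sketch of the pattern (the stub classes below only stand in for keras.optimizers.SGD, RMSprop and Adam so the snippet is self-contained; in real tuner code you would use the Keras classes and call this inside build_model):

```python
# Stub classes standing in for keras.optimizers.SGD / RMSprop / Adam,
# so the dispatch pattern can be shown without importing TensorFlow.
class SGD:
    def __init__(self, learning_rate):
        self.learning_rate = learning_rate

class RMSprop:
    def __init__(self, learning_rate):
        self.learning_rate = learning_rate

class Adam:
    def __init__(self, learning_rate):
        self.learning_rate = learning_rate

# Map the hp.Choice string to an optimizer *class* (not an instance),
# so construction happens only for the chosen name.
OPTIMIZERS = {"sgd": SGD, "rmsprop": RMSprop, "adam": Adam}

def build_optimizer(name, learning_rate):
    """Turn an hp.Choice string into a constructed optimizer."""
    try:
        cls = OPTIMIZERS[name]
    except KeyError:
        raise ValueError(f"unknown optimizer: {name!r}") from None
    return cls(learning_rate=learning_rate)
```

Usage: `build_optimizer('adam', 1e-3)` returns an Adam instance with that learning rate, and an unrecognized name raises ValueError instead of silently falling through.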

Answered By: xdurch0

The question was answered a long time ago, but maybe someone will need an alternative solution that works in my setting:

hp_learning_rate = hp.Choice('learning_rate',
                             values=[1e-2, 1e-3, 1e-4])

optimizers_dict = {
    "Adam":    Adam(learning_rate=hp_learning_rate),
    "SGD":     SGD(learning_rate=hp_learning_rate),
    "Adagrad": Adagrad(learning_rate=hp_learning_rate)
}

hp_optimizers = hp.Choice(
    'optimizer',
    values=["Adam", "SGD", "Adagrad"]
)

model.compile(
    loss='binary_crossentropy',
    optimizer=optimizers_dict[hp_optimizers],
    metrics=[
        AUC(curve="ROC", name="roc"),
        AUC(curve="PR", name="pr"),
        Precision(name='precision'),
        f1_score  # custom metric defined elsewhere in my setting
    ]
)

Answered By: Tomasz Waszczewski