Friday, 5 January 2024

KerasTuner: Custom Metrics (e.g., F1 Score, AUC) in Objective with RandomSearch Error

I'm using KerasTuner for hyperparameter tuning of a Keras neural network, and I would like to use common metrics such as F1 score and ROC AUC as the tuning objective. However, when I specify these metrics in the kt.Objective for RandomSearch, KerasTuner fails because it cannot find them in the logs during training.

Here is an example of how I define my objective:

tuner = kt.RandomSearch(
    MyHyperModel(),
    objective=kt.Objective("val_f1", direction="max"),
    max_trials=100,
    overwrite=True,
    directory="my_dir",
    project_name="tune_hypermodel",
)

But I get:

RuntimeError: Number of consecutive failures exceeded the limit of 3.
Traceback (most recent call last):
  File "/usr/local/lib/python3.10/dist-packages/keras_tuner/src/engine/base_tuner.py", line 273, in _try_run_and_update_trial
    self._run_and_update_trial(trial, *fit_args, **fit_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/keras_tuner/src/engine/base_tuner.py", line 264, in _run_and_update_trial
    tuner_utils.convert_to_metrics_dict(
  File "/usr/local/lib/python3.10/dist-packages/keras_tuner/src/engine/tuner_utils.py", line 132, in convert_to_metrics_dict
    [convert_to_metrics_dict(elem, objective) for elem in results]
  File "/usr/local/lib/python3.10/dist-packages/keras_tuner/src/engine/tuner_utils.py", line 132, in <listcomp>
    [convert_to_metrics_dict(elem, objective) for elem in results]
  File "/usr/local/lib/python3.10/dist-packages/keras_tuner/src/engine/tuner_utils.py", line 145, in convert_to_metrics_dict
    best_value, _ = _get_best_value_and_best_epoch_from_history(
  File "/usr/local/lib/python3.10/dist-packages/keras_tuner/src/engine/tuner_utils.py", line 116, in _get_best_value_and_best_epoch_from_history
    objective_value = objective.get_value(metrics)
  File "/usr/local/lib/python3.10/dist-packages/keras_tuner/src/engine/objective.py", line 59, in get_value
    return logs[self.name]
KeyError: 'val_f1'
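
From the traceback, my understanding is that KerasTuner simply looks the objective name up in the History logs, so "val_f1" would have to come from a metric passed to model.compile. For AUC, this is roughly what I pieced together (a sketch on my part, not something I found in the docs; the class name AUCHyperModel, the explicit name="auc", and the resulting "val_auc" log key are my assumptions):

from tensorflow import keras
from tensorflow.keras import layers
import keras_tuner as kt

class AUCHyperModel(kt.HyperModel):
    def build(self, hp):
        model = keras.Sequential([
            layers.Flatten(),
            layers.Dense(hp.Int("units", min_value=24, max_value=128, step=10), activation="relu"),
            layers.Dense(1, activation="sigmoid"),
        ])
        model.compile(
            optimizer="adam",
            loss="binary_crossentropy",
            # Naming the metric explicitly so the logged key is predictable;
            # training should then report "auc" and "val_auc" in the logs.
            metrics=[keras.metrics.AUC(name="auc")],
        )
        return model

tuner = kt.RandomSearch(
    AUCHyperModel(),
    # Has to match the logged key exactly ("val_auc", not "val_AUC").
    objective=kt.Objective("val_auc", direction="max"),
    max_trials=100,
    overwrite=True,
    directory="my_dir",
    project_name="tune_auc",
)

But I'm not sure these names are right.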

I would be very thankful if someone could point me to the list of metric names that are actually available, because I have searched the Keras documentation over and over and cannot find them. The only snippet that has worked for me uses the accuracy metric, like this:

import keras_tuner as kt
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report
from tensorflow import keras
from tensorflow.keras import layers
from tensorflow.keras.models import Sequential
from tensorflow.keras.optimizers import Adam
from tensorflow.keras.regularizers import l2


class MyHyperModel(kt.HyperModel):
    def build(self, hp):
        model = Sequential()
        model.add(layers.Flatten())
        model.add(
            layers.Dense(
                units=hp.Int("units", min_value=24, max_value=128, step=10),
                activation="relu",
            )
        )
        model.add(layers.Dense(1, activation="sigmoid"))
        model.compile(
            # Alternative I also tried: hp.Float('learning_rate', 5e-5, 5e-1, sampling='log')
            optimizer=Adam(learning_rate=hp.Float('learning_rate', 5e-5, 5e-1, step=0.001)),
            loss='binary_crossentropy',
            metrics=['accuracy'],
        )
        return model

    def fit(self, hp, model, *args, **kwargs):
        return model.fit(
            *args,
            batch_size=hp.Choice("batch_size", [16, 32, 52]),
            epochs=hp.Int('epochs', min_value=5, max_value=25, step=5),
            **kwargs,
        )


tuner = kt.RandomSearch(
    MyHyperModel(),
    objective="val_accuracy",
    max_trials=100,
    overwrite=True,
    directory="my_dir",
    project_name="tune_hypermodel",
)

tuner.search(
    X_train,
    y_train,
    validation_data=(X_test, y_test),
    callbacks=[keras.callbacks.EarlyStopping('val_loss', patience=3)],
)
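
Following the same pattern, I assumed F1 would work if the compiled metric produced a matching log key. This is the variant I sketched out (keras.metrics.F1Score is only built in on newer Keras/TensorFlow versions, and I'm not certain about the averaging argument, so treat this as a guess rather than working code):

# Inside MyHyperModel.build(), swap the metrics list:
model.compile(
    optimizer=Adam(learning_rate=hp.Float('learning_rate', 5e-5, 5e-1, step=0.001)),
    loss='binary_crossentropy',
    # Built-in F1Score exists only in newer Keras releases; older setups
    # used tensorflow_addons.metrics.F1Score instead. average="micro"
    # should reduce the per-class results to a single scalar for logging.
    metrics=[keras.metrics.F1Score(average="micro", threshold=0.5)],
)

# The metric's default name is "f1_score", so presumably the
# validation log key is "val_f1_score" rather than "val_f1":
tuner = kt.RandomSearch(
    MyHyperModel(),
    objective=kt.Objective("val_f1_score", direction="max"),
    max_trials=100,
    overwrite=True,
    directory="my_dir",
    project_name="tune_hypermodel",
)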

Is it possible that accuracy is the only metric Keras supports out of the box for tuning, and that any other metric has to be defined ourselves? I would be very thankful for a pointer to the relevant documentation, or an example of how to define objective metrics for AUC and F1. Thank you so much!
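
For completeness, the only other route I tried was defining F1 myself as a plain function, since Keras logs a function metric under its __name__ (so a function called f1 should produce "val_f1", matching my original objective). I'm not sure a batch-wise F1 like this is even statistically sound, which is part of what I'm asking:

import tensorflow as tf
from tensorflow.keras import backend as K

def f1(y_true, y_pred):
    # Batch-wise F1 for binary classification; Keras averages it over
    # batches, so it only approximates the true epoch-level F1 score.
    y_pred = tf.cast(y_pred > 0.5, tf.float32)
    y_true = tf.cast(y_true, tf.float32)
    tp = tf.reduce_sum(y_true * y_pred)
    precision = tp / (tf.reduce_sum(y_pred) + K.epsilon())
    recall = tp / (tf.reduce_sum(y_true) + K.epsilon())
    return 2 * precision * recall / (precision + recall + K.epsilon())

# Compiled with metrics=['accuracy', f1], the training logs should then
# contain "val_f1", matching kt.Objective("val_f1", direction="max").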



