Saturday, 20 August 2022

F1 Score metric per class in Tensorflow

I have implemented the following metrics to look at Precision and Recall for the classes I deem relevant.

metrics = [tf.keras.metrics.Recall(class_id=1, name='Bkwd_R'),
           tf.keras.metrics.Recall(class_id=2, name='Fwd_R'),
           tf.keras.metrics.Precision(class_id=1, name='Bkwd_P'),
           tf.keras.metrics.Precision(class_id=2, name='Fwd_P')]
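These metrics go straight into model.compile; the following is a minimal sketch, where the optimizer and loss are placeholders and not part of the original setup:

model.compile(optimizer='adam',
              loss='categorical_crossentropy',  # placeholder; class_id indexes the last dimension of y_true/y_pred
              metrics=[tf.keras.metrics.Recall(class_id=1, name='Bkwd_R'),
                       tf.keras.metrics.Recall(class_id=2, name='Fwd_R'),
                       tf.keras.metrics.Precision(class_id=1, name='Bkwd_P'),
                       tf.keras.metrics.Precision(class_id=2, name='Fwd_P')])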

How can I implement the same in Tensorflow 2.5 for the F1 score (i.e. specifically for class 1 and class 2, and not class 0), without a custom function?


Update

Using this metric setup:

tfa.metrics.F1Score(num_classes = 3, average = None, name = f1_name)
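For context, a minimal sketch of how this metric can be wired into model.compile; the optimizer, loss, and the value of f1_name are placeholders, not taken from the original model:

import tensorflow_addons as tfa

f1_name = 'f1_score'  # placeholder
model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=[tfa.metrics.F1Score(num_classes=3, average=None, name=f1_name)])

With average=None the metric returns one F1 value per class rather than a single scalar.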

I get the following during training:

13367/13367 [==============================] - 465s 34ms/step - loss: 0.1683 - f1_score: 0.5842 - val_loss: 0.0943 - val_f1_score: 0.3314

and when I do model.evaluate:

224/224 [==============================] - 11s 34ms/step - loss: 0.0665 - f1_score: 0.3325

and the returned score is:

Score: [0.06653735041618347, array([0.99740255, 0.        , 0.        ], dtype=float32)]

The problem is that this trains and monitors based on the average over all three classes (the single f1_score of 0.3325 reported by model.evaluate is exactly the mean of the per-class values in the array above), but I would like to train on a sensible average of, or on each of, the F1 scores of the last two classes in the array (which are 0 in this case).
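If a small amount of custom metric code is acceptable, one possible direction is to subclass tfa.metrics.F1Score and average only the class indices of interest. This is only a sketch; the class SelectedClassesF1 and its arguments are made up for illustration, not an existing API:

import tensorflow as tf
import tensorflow_addons as tfa

class SelectedClassesF1(tfa.metrics.F1Score):
    # Hypothetical helper: mean F1 over a chosen subset of class indices.
    def __init__(self, num_classes, class_ids, name='selected_f1', **kwargs):
        # average=None keeps one F1 value per class inside the parent metric
        super().__init__(num_classes=num_classes, average=None, name=name, **kwargs)
        self.class_ids = class_ids

    def result(self):
        per_class = super().result()  # shape: (num_classes,)
        return tf.reduce_mean(tf.gather(per_class, self.class_ids))

# e.g. mean F1 over classes 1 and 2 only, ignoring class 0
metrics = [SelectedClassesF1(num_classes=3, class_ids=[1, 2], name='Fwd_Bkwd_F1')]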



from F1 Score metric per class in Tensorflow
