Issue between number of classes and shape of inputs in metric collection torch
Question:
I want to calculate some metrics with torchmetrics, but I hit this error:
ValueError: The implied number of classes (from shape of inputs) does not match num_classes.
The output comes from a UNet and the loss function is BCEWithLogitsLoss (binary segmentation).
channels = 1 because the images are grayscale.
Input shape: (batch_size, channels, h, w) torch.float32
Label shape: (batch_size, channels, h, w) torch.float32 for BCE
Output shape: (batch_size, channels, h, w): torch.float32
inputs, labels = batch
outputs = model(inputs)
loss = self.loss_function(outputs, labels)
prec = torchmetrics.Precision(num_classes=1)(outputs, labels.type(torch.int32))
Answers:
It seems that torchmetrics expects a different shape. Try flattening both the outputs and the labels:
prec = torchmetrics.Precision(num_classes=1)(outputs.view(-1), labels.type(torch.int32).view(-1))
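The error arises because torchmetrics infers the number of classes from the trailing dimensions of the input; flattening to 1-D makes it treat every pixel as an independent binary prediction. A minimal, dependency-free sketch of what the flattened precision computes (thresholding raw logits at 0, which matches sigmoid(logit) > 0.5 for BCEWithLogitsLoss; the sample values below are made up for illustration):

```python
# Pure-Python sketch of the precision torchmetrics computes on the
# flattened tensors: TP / (TP + FP) over individual pixel predictions.
# Logits > 0 correspond to sigmoid(logit) > 0.5.

def binary_precision(logits, labels):
    """Precision over flattened pixel predictions."""
    preds = [1 if z > 0 else 0 for z in logits]
    tp = sum(p == 1 and y == 1 for p, y in zip(preds, labels))
    fp = sum(p == 1 and y == 0 for p, y in zip(preds, labels))
    return tp / (tp + fp) if (tp + fp) else 0.0

# Imagine outputs.view(-1) and labels.view(-1) yielded these pixels:
logits = [2.3, -1.1, 0.7, -0.4, 1.5]   # raw UNet outputs
labels = [1,    0,   0,    0,   1]     # ground-truth mask
print(binary_precision(logits, labels))  # 2 TP, 1 FP -> 0.666...
```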
I was using the torchmetrics library to calculate F1 score, Precision, and Recall for a segmentation task. I was trying to get the F1 score for each of my individual classes when I encountered the above error. The solution above works, but first I had to set 'multiclass=True' together with 'num_classes=2':
torchmetrics_f1_none = torchmetrics.classification.F1Score(average=None, num_classes=2, multiclass=True)
f1_0, f1_1 = torchmetrics_f1_none(thres_out.view(-1), masks.int().view(-1))
print("F1 Score for Background - {}, F1 Score for Foreground - {}\n".format(f1_0, f1_1))
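To make the per-class result concrete, here is a pure-Python sketch of what F1Score(average=None, num_classes=2) returns: each class (0 = background, 1 = foreground) is scored one-vs-rest over the flattened pixels. The sample predictions and mask below are made up for illustration:

```python
# Sketch of per-class F1 on flattened pixel predictions:
# F1_c = 2*TP / (2*TP + FP + FN), computed one-vs-rest for each class.

def per_class_f1(preds, labels, num_classes=2):
    scores = []
    for c in range(num_classes):
        tp = sum(p == c and y == c for p, y in zip(preds, labels))
        fp = sum(p == c and y != c for p, y in zip(preds, labels))
        fn = sum(p != c and y == c for p, y in zip(preds, labels))
        denom = 2 * tp + fp + fn
        scores.append(2 * tp / denom if denom else 0.0)
    return scores

# e.g. thresholded predictions vs. ground-truth mask, both flattened:
preds  = [0, 1, 1, 0, 1, 0]
labels = [0, 1, 0, 0, 1, 1]
f1_background, f1_foreground = per_class_f1(preds, labels)
```

Averaging these two scores recovers what average="macro" would report, which is why average=None is the right setting when you need the classes separately.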