Custom classifier in PyTorch: I want to add softmax

Question:

I have this classifier:

input_dim = 25088
h1_dim = 4096
h2_dim = 2048
h3_dim = 1024
h4_dim = 512
output_dim = len(cat_to_name) # 102
drop_prob = 0.2

model.classifier = nn.Sequential(nn.Linear(input_dim, h1_dim),
                                 nn.ReLU(),
                                 nn.Dropout(drop_prob),
                                 nn.Linear(h1_dim, h2_dim),
                                 nn.ReLU(),
                                 nn.Dropout(drop_prob),
                                 nn.Linear(h2_dim, h3_dim),
                                 nn.ReLU(),
                                 nn.Dropout(drop_prob),
                                 nn.Linear(h3_dim, h4_dim),
                                 nn.ReLU(),
                                 nn.Dropout(drop_prob),
                                 nn.Linear(h4_dim, output_dim),
                                 )

and I went with CrossEntropyLoss as the criterion. How can I add Softmax during validation and testing? This is the validation loop:

model.eval()
with torch.no_grad():
    for images, labels in valid_loader:
        images, labels = images.to(device), labels.to(device)
        images.requires_grad = True

        logits = model.forward(images)
        batch_loss = criterion(logits, labels)
        valid_loss += batch_loss.item()

        ps = torch.exp(logits)
        top_p, top_class = ps.topk(1, dim=1)
        equals = top_class == labels.view(*top_class.shape)
Asked By: Abdulsalam


Answers:

  • CrossEntropyLoss already applies the softmax function internally. From the PyTorch docs:

Note that this case is equivalent to the combination of LogSoftmax and
NLLLoss.

So if you just want to use cross-entropy loss, there is no need to apply Softmax beforehand.
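
A quick way to convince yourself of the equivalence (a minimal sketch with arbitrary tensors, not code from the question):

import torch
import torch.nn as nn

# Arbitrary logits for a batch of 3 samples and 5 classes.
logits = torch.randn(3, 5)
labels = torch.tensor([0, 2, 4])

# CrossEntropyLoss on raw logits ...
ce = nn.CrossEntropyLoss()(logits, labels)

# ... matches NLLLoss applied to log-softmaxed logits.
nll = nn.NLLLoss()(nn.LogSoftmax(dim=1)(logits), labels)

print(torch.allclose(ce, nll))  # True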

  • If you really want to apply the Softmax function anyway, e.g. to report class probabilities, you can do:
m = nn.Softmax(dim=1)
output = m(logits)

assuming your logits have a shape of (batch_size, number_classes).
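
Applied to the validation loop from the question, a sketch could look like this (assuming the same valid_loader, criterion, and device as in the question, plus an accuracy accumulator that is not in the original code). Note that torch.exp(logits) is only correct when the model ends in LogSoftmax; with raw logits and CrossEntropyLoss, use softmax instead. Either way topk picks the same class, since softmax is monotonic:

model.eval()
with torch.no_grad():
    for images, labels in valid_loader:
        images, labels = images.to(device), labels.to(device)

        logits = model(images)
        valid_loss += criterion(logits, labels).item()

        # Softmax turns raw logits into probabilities; this is only
        # needed for reporting, since topk on raw logits already
        # yields the same predicted classes.
        ps = torch.softmax(logits, dim=1)
        top_p, top_class = ps.topk(1, dim=1)
        equals = top_class == labels.view(*top_class.shape)
        accuracy += equals.float().mean().item()  # assumed accumulator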

You can check: https://pytorch.org/docs/stable/generated/torch.nn.Softmax.html
https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html

Answered By: PlainRavioli