Add a loss term in SRGAN

Question:

I want to add a loss term to SRGAN
(https://github.com/leftthomas/SRGAN)
in train.py:

g_loss = generator_criterion(fake_out, fake_img, real_img)

Can I write a function myself, like this:

import numpy as np


def ContentLoss(a, b):
    result = 0
    for x, y in zip(a, b):
        shape = x.shape
        k = np.prod(shape)  # total number of elements
        diff = x - y
        # l2 norm
        diff = np.sqrt(np.sum(np.square(diff)))
        diff = diff * diff
        diff = diff / k
        result = result + diff
    return result

And add it to the original loss as follows:

loss = ContentLoss(a, b)
g_loss = generator_criterion(fake_out, fake_img, real_img) + loss

Is there a way to calculate the gradient of this loss during training?

Asked By: Steven


Answers:

The project you linked uses PyTorch. Assuming that you are using it as well, you can simply implement your loss with PyTorch operations instead of numpy, and you're covered:

import torch
import math


def ContentLoss(a, b):
    result = 0
    for x, y in zip(a, b):
        shape = x.shape
        k = math.prod(shape)  # np.prod also works here; torch.prod expects a tensor
        diff = x - y
        # l2 norm
        # just replacing np with torch
        diff = torch.sqrt(torch.sum(torch.square(diff)))
        diff = diff * diff  # torch differentiates through these as well
        diff = diff / k
        result = result + diff
    return result

# I assume you only want gradients for a (are these your model outputs?)
# This is just test data anyway. If this is your model output, 
# it will pass on the gradients to the weights.
a = [
    torch.tensor([1.0, 1.0, 1.0], requires_grad=True),
    torch.tensor([0.0, -1.0, 1.0], requires_grad=True),
    torch.tensor([7.0, 6.0, -5.0], requires_grad=True),
]
b = [
    torch.tensor([1.0, 0.0, 1.0]),
    torch.tensor([1.0, -2.0, 3.0]),
    torch.tensor([0.0, 0.0, 0.0]),
]

loss = ContentLoss(a, b)
loss.backward()  # computes the gradients
for x in a:
    print(f"x={x}, gradient={x.grad}")

If you need to keep numpy, you will have to create a custom torch.autograd.Function and implement forward and backward yourself: https://pytorch.org/tutorials/beginner/examples_autograd/two_layer_net_custom_function.html.
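
For illustration, here is a minimal sketch of such a Function for a single pair of tensors (the class name NumpyContentLoss is made up; it computes the same mean-squared-error term as above, with the forward and backward done in numpy):

import numpy as np
import torch


class NumpyContentLoss(torch.autograd.Function):
    # autograd cannot trace through numpy operations,
    # so the backward pass has to be written by hand

    @staticmethod
    def forward(ctx, x, y):
        diff = x.detach().cpu().numpy() - y.detach().cpu().numpy()
        ctx.save_for_backward(x, y)
        return torch.tensor((diff ** 2).mean(), dtype=x.dtype, device=x.device)

    @staticmethod
    def backward(ctx, grad_output):
        x, y = ctx.saved_tensors
        diff = (x - y).detach().cpu().numpy()
        # d/dx of mean((x - y)^2) is 2 * (x - y) / number_of_elements
        grad = torch.from_numpy(2.0 * diff / diff.size).to(x.device, x.dtype)
        return grad_output * grad, None  # no gradient needed for y


x = torch.tensor([1.0, 2.0, 3.0], requires_grad=True)
y = torch.tensor([1.0, 0.0, 1.0])
loss = NumpyContentLoss.apply(x, y)
loss.backward()
print(x.grad)  # tensor([0.0000, 1.3333, 1.3333])

That said, the pure-PyTorch version above is usually preferable: it runs on the GPU and you do not have to keep the hand-written backward in sync with the forward.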

Answered By: cherrywoods