Sunday, 11 September 2022

Apply math operation dynamically between two nn.Modules or loss functions

I would like to apply math operations dynamically between two loss functions, nn.Modules, or other Python objects. It could also be treated as the problem of generating dynamic computation graphs in PyTorch.

For example, I would like to add two loss functions:

nn.L1Loss() + nn.CosineEmbeddingLoss()

If I do this, it gives me an error:

----> 1 nn.L1Loss() + nn.CosineEmbeddingLoss()
TypeError: unsupported operand type(s) for +: 'L1Loss' and 'CosineEmbeddingLoss'
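
As far as I understand, this fails because nn.Module does not define __add__, so Python has no way to add two module instances. What does work is adding the tensors that the loss modules return once they are called on actual inputs; a minimal sketch, with made-up tensor shapes:

import torch
from torch import nn

pred = torch.randn(4, 8)
target = torch.randn(4, 8)
y = torch.ones(4)  # CosineEmbeddingLoss target: 1 means "similar"

# Calling each loss module returns a tensor, and tensors do support `+`
l1 = nn.L1Loss()(pred, target)
cos = nn.CosineEmbeddingLoss()(pred, target, y)
total = l1 + cos

But here the combination is hard-coded, which is exactly what I want to avoid.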

I also tried creating a wrapper with a forward function and torch operations, as shown below, but it doesn't work either. In this case, x and y can be any loss functions, and op can be any math operation such as addition, subtraction, and so on.

class Execute_Op(nn.Module):
    def __init__(self):
        super().__init__()
        
    def forward(self, x, y, op):
        if op == 'add':
            return torch.add(x, y)
        elif op == 'subtract':
            return torch.subtract(x, y)

exec_op = Execute_Op()
exec_op(nn.L1Loss(), nn.CosineEmbeddingLoss(), 'add')

It gives an error like the one below:

Execute_Op.forward(self, x, y, op)
      5 def forward(self, x, y, op):
      6     if op == 'add':
----> 7         return torch.add(x, y)
      8     elif op == 'subtract':
      9         return torch.subtract(x, y)

TypeError: add(): argument 'input' (position 1) must be Tensor, not L1Loss

I am aware of the functional APIs and the general way of passing ground-truth and predicted values to a loss function. But in that case, I cannot combine loss functions dynamically at run time.
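
For reference, this is the kind of functional usage I mean; the loss values combine fine as tensors, but the operation is fixed at write time:

import torch
import torch.nn.functional as F

pred, target = torch.randn(4, 8), torch.randn(4, 8)
y = torch.ones(4)

# The `+` is chosen when the code is written, not at run time
loss = F.l1_loss(pred, target) + F.cosine_embedding_loss(pred, target, y)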

I am not sure how exactly to implement this, but any help is really appreciated. Also, if there is a Pythonic way or a PyTorch way to do this, that would be great.

Edited:

  • I would like to be able to call this function/class recursively; a rough sketch of what I have in mind is below.
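
The closest thing I can sketch myself is a wrapper along the lines below. CombinedLoss and its argument layout are my own invention, not an existing API; it applies an operator from Python's operator module to the tensors the losses return, rather than to the modules themselves:

import operator
import torch
from torch import nn

class CombinedLoss(nn.Module):
    # Map op names to Python's tensor-compatible operator functions
    OPS = {'add': operator.add, 'subtract': operator.sub}

    def __init__(self, loss_a, loss_b, op):
        super().__init__()
        self.loss_a = loss_a
        self.loss_b = loss_b
        self.op = self.OPS[op]

    def forward(self, args_a, args_b):
        # Each loss gets its own argument tuple, because losses such as
        # CosineEmbeddingLoss take different inputs than L1Loss
        return self.op(self.loss_a(*args_a), self.loss_b(*args_b))

pred, target = torch.randn(4, 8), torch.randn(4, 8)
y = torch.ones(4)

combined = CombinedLoss(nn.L1Loss(), nn.CosineEmbeddingLoss(), 'add')
loss = combined((pred, target), (pred, target, y))

# CombinedLoss is itself an nn.Module, so instances can be nested,
# which would cover the recursive case:
nested = CombinedLoss(combined, nn.MSELoss(), 'subtract')
loss2 = nested(((pred, target), (pred, target, y)), (pred, target))

But I don't know whether this is the Pythonic or PyTorch-idiomatic way to do it, hence the question.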


from Apply math operation dynamically between two nn.Modules or loss functions
