Saturday, 22 October 2022

Best way to implicitly change the value of nn.Parameter() in Pytorch?

Suppose I want to optimize a vector v while keeping its norm equal to 1. To do that, I defined a network containing that vector as follows:

import torch
import torch.nn as nn

class myNetwork(nn.Module):
    def __init__(self, initial_vector):
        super(myNetwork, self).__init__()
        # Define the parameter from an initial column vector
        self.v = nn.Parameter(initial_vector)

    def forward(self, x):
        # Rescale v in place so that its norm is equal to 1
        self.v.data = self.v.data / torch.sqrt(self.v.data.transpose(1, 0) @ self.v.data)
        # Multiply a row vector (or a batch of row vectors) by v
        out = x @ self.v
        return out
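For context, a minimal self-contained usage sketch of this module (the 3-dimensional initial vector and the batch size here are illustrative, not from the original post):

    import torch
    import torch.nn as nn

    class myNetwork(nn.Module):
        def __init__(self, initial_vector):
            super(myNetwork, self).__init__()
            self.v = nn.Parameter(initial_vector)

        def forward(self, x):
            self.v.data = self.v.data / torch.sqrt(self.v.data.transpose(1, 0) @ self.v.data)
            return x @ self.v

    initial_vector = torch.tensor([[3.0], [0.0], [4.0]])  # column vector, norm 5
    net = myNetwork(initial_vector)

    x = torch.randn(2, 3)   # batch of two row vectors
    out = net(x)            # forward renormalizes v in place, then projects
    print(out.shape)        # shape (2, 1)
    print(net.v.norm())     # ~1.0 after the in-place rescaling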

Is the use of .data the best way to update v? Does it take the normalization into account during backpropagation?
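One way to probe the backpropagation question is to compare the gradient from an autograd-tracked normalization against the gradient obtained after a .data rescaling. A minimal sketch (the variable names are illustrative, not from the original post):

    import torch
    import torch.nn as nn

    x = torch.tensor([[1.0, 1.0]])  # a single row vector

    # Autograd-tracked normalization: the unit vector is built inside the graph,
    # so the gradient of v reflects the division by the norm.
    v = nn.Parameter(torch.tensor([[3.0], [4.0]]))
    v_unit = v / v.norm()           # differentiable
    (x @ v_unit).sum().backward()
    print(v.grad)                   # gradient of x @ (v / ||v||)

    # .data rescaling: the division is invisible to autograd, so the gradient
    # is just that of a plain x @ w with the already-rescaled value of w.
    w = nn.Parameter(torch.tensor([[3.0], [4.0]]))
    w.data = w.data / w.data.norm() # not recorded in the graph
    (x @ w).sum().backward()
    print(w.grad)                   # gradient of plain x @ w

The two printed gradients differ, which shows that mutating .data changes the stored value but is not itself differentiated.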



