Wednesday, 16 June 2021

Adding a feature to a neuron

Say I have the following model:

import torch
import torch.nn as nn
import torch.optim as optim

class Model(nn.Module):
    def __init__(self):
        super(Model, self).__init__()
        self.fc1 = nn.Linear(1, 5)
        self.fc2 = nn.Linear(5, 10)
        self.fc3 = nn.Linear(10, 1)

    def forward(self, x):
        x = self.fc1(x)
        print(x) 
        x = torch.relu(x)        
        x = torch.relu(self.fc2(x))
        x = self.fc3(x)
        return x

net = Model()

opt = optim.Adam(net.parameters())
features = torch.rand((3,1)) #3 inputs, each of 1D

I can print the value of my neurons (only the first layer here) with print(x):

net(features)
>>>tensor([[ 0.6703,  0.4484, -0.8529,  1.3119,  0.6741],
        [ 0.9112,  0.6496, -1.2960,  1.8264,  0.4547],
        [ 0.7483,  0.5135, -0.9963,  1.4785,  0.6031]],
       grad_fn=<AddmmBackward>)
tensor([[0.0144],
        [0.0575],
        [0.0284]], grad_fn=<AddmmBackward>)

How can I add a "feature" to each neuron that is a string with a name? e.g.

print(x)
>>> tensor([[ [0.6703, 'neuron_1'],  [0.4484, 'neuron_2'], [-0.8529, 'neuron_3'],  1.3119,  0.6741],... etc.

I'm not sure if I'll need to change the neuron class. I believe in the forward method I will then need to take only the first element of each neuron's tensor: neuron_tensor = [neuron_value, neuron_name].
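PyTorch tensors can only hold numeric values, so a string cannot live inside the tensor itself. One workaround is to keep the names in a parallel Python list and zip them with the activation columns when printing. Here is a minimal sketch of that idea (NamedModel and fc1_names are hypothetical names, not part of the original model):

```python
import torch
import torch.nn as nn

class NamedModel(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc1 = nn.Linear(1, 5)
        # one name per neuron in fc1; the names live outside the tensor
        self.fc1_names = [f"neuron_{i + 1}" for i in range(5)]

    def forward(self, x):
        x = self.fc1(x)
        # pair each activation column with its name, for inspection only;
        # the math below still uses the plain numeric tensor
        for name, col in zip(self.fc1_names, x.t()):
            print(name, col.detach().tolist())
        return torch.relu(x)

net = NamedModel()
out = net(torch.rand(3, 1))
print(out.shape)  # torch.Size([3, 5])
```

Because the names never enter the tensor, autograd and the rest of the forward pass are unaffected.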

Update 1: from @Aditya Singh Rathore's comment, it sounds like it might not be possible to have a string and a value in the same tensor. Is it possible, then, to have a value instead of a string to represent the neurons?

From before, neuron_tensor = [neuron_value, neuron_name], where neuron_name is a string. Is this possible instead: neuron_tensor = [neuron_value, neuron_name], where neuron_name is just a value (e.g. 1 for neuron 1, 2 for neuron 2)?
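Numeric tags, unlike strings, can share a tensor with the activations. One sketch of the idea above: stack a row of neuron IDs (1..5) onto the first layer's output so each entry becomes [activation, neuron_id], then strip the IDs off again before the values feed the next layer. The variable names below are illustrative, not from the original code:

```python
import torch
import torch.nn as nn

fc1 = nn.Linear(1, 5)
x = torch.rand(3, 1)

act = fc1(x)                               # activations, shape (3, 5)
ids = torch.arange(1, 6, dtype=act.dtype)  # neuron IDs 1..5
ids = ids.expand_as(act)                   # broadcast to shape (3, 5)

# stack values and IDs along a new last dim: shape (3, 5, 2),
# so tagged[b, j] == [activation, neuron_id]
tagged = torch.stack([act, ids], dim=-1)
print(tagged[0, 0])  # activation of neuron 1 for sample 0, plus its ID

# recover the plain activations for the rest of the forward pass
values = tagged[..., 0]
```

Note that the IDs carry no gradient information; they are just labels riding along with the values, so they must be sliced away (tagged[..., 0]) before any further linear layers.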



from Adding a feature to a neuron
