Saturday, 6 July 2019

Sampled Softmax in Keras Model

Some approaches I have considered:

Inheriting from the Model class: Sampled softmax in tensorflow keras

Inheriting from the Layers class: How can I use TensorFlow's sampled softmax loss function in a Keras model?

Of the two, the Model approach is cleaner; the Layers approach is a little hacky - it pushes the target in as part of the input, which rules out multi-output models.

I'd like some help subclassing the Model class. Specifically: 1) Unlike the first approach, I would like to take in any number of layers, as we do when specifying a standard Keras model. For example,

class LanguageModel(tf.keras.Model):
    def __init__(self, **kwargs):
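As a sketch of (1), a subclassed model can simply hold a list of layers and apply them in sequence, so any stack of layers can be passed in as with a standard Keras model. The constructor arguments (`layers`, `hidden_size`) are assumptions for illustration, not a fixed API:

```python
import tensorflow as tf

class LanguageModel(tf.keras.Model):
    """Minimal sketch: accepts any number of layers, as when
    specifying a standard Keras model (names are assumptions)."""

    def __init__(self, layers, hidden_size, **kwargs):
        super().__init__(**kwargs)
        self.hidden_layers = list(layers)  # arbitrary stack of Keras layers
        self.hidden_size = hidden_size

    def call(self, inputs):
        x = inputs
        for layer in self.hidden_layers:  # apply the stack in order
            x = layer(x)
        return x
```

For example, `LanguageModel([tf.keras.layers.Dense(32), tf.keras.layers.Dense(16)], hidden_size=16)` behaves like a two-layer functional model.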

2) I am looking to incorporate the code below within the Model class, but want the Model class to recognize it as the loss:

def call(self, y_true, input):
    """Reshape y_true and input so they fit each other."""
    input = tf.reshape(input, (-1, self.hidden_size))
    y_true = tf.reshape(y_true, (-1, 1))
    # Output weights and biases (vocab_size and num_sampled assumed
    # to be attributes set in __init__).
    weights = tf.Variable(
        tf.random.normal((self.vocab_size, self.hidden_size)))
    biases = tf.Variable(tf.zeros((self.vocab_size,)))
    # Training: approximate loss over a sampled subset of classes.
    loss = tf.nn.sampled_softmax_loss(
        weights=weights,
        biases=biases,
        labels=y_true,
        inputs=input,
        num_sampled=self.num_sampled,
        num_classes=self.vocab_size,
        partition_strategy="div")
    # Inference: full softmax over the whole vocabulary.
    logits = tf.matmul(input, tf.transpose(weights))
    logits = tf.nn.bias_add(logits, biases)
    y_pred = tf.nn.softmax_cross_entropy_with_logits_v2(
        labels=tf.one_hot(tf.reshape(y_true, (-1,)), self.vocab_size),
        logits=logits)
    return loss, y_pred

3) I guess I need some pointers on which sections of the Model class in the functional API I should work with, knowing I have to write a custom loss function like the one above. I suspect the issue is accessing the weights inside the tf.nn.sampled_softmax_loss function.
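On the weight-access question, one possible approach is to create the output projection with `add_weight` inside the subclassed model, so Keras tracks the variables and they are directly visible to `tf.nn.sampled_softmax_loss`. A minimal sketch, assuming TF 2.x - the class name, `base` sub-model, and `sampled_loss` method are all illustrative, not an established API:

```python
import tensorflow as tf

class SampledSoftmaxModel(tf.keras.Model):
    """Sketch: the model owns the output weights, so the
    sampled-softmax loss can reach them (names are assumptions)."""

    def __init__(self, base, vocab_size, hidden_size, num_sampled, **kwargs):
        super().__init__(**kwargs)
        self.base = base                  # any Keras stack -> hidden_size
        self.vocab_size = vocab_size
        self.hidden_size = hidden_size
        self.num_sampled = num_sampled
        # Output projection, created via add_weight so Keras tracks it.
        self.out_w = self.add_weight(
            name="out_w", shape=(vocab_size, hidden_size), trainable=True)
        self.out_b = self.add_weight(
            name="out_b", shape=(vocab_size,), initializer="zeros",
            trainable=True)

    def call(self, inputs):
        return self.base(inputs)

    def sampled_loss(self, x, y_true, training=True):
        h = tf.reshape(self(x), (-1, self.hidden_size))
        labels = tf.cast(tf.reshape(y_true, (-1, 1)), tf.int64)
        if training:
            # Cheap approximate loss over a sampled subset of classes.
            loss = tf.nn.sampled_softmax_loss(
                weights=self.out_w, biases=self.out_b,
                labels=labels, inputs=h,
                num_sampled=self.num_sampled,
                num_classes=self.vocab_size)
        else:
            # Exact loss over the full vocabulary for evaluation.
            logits = tf.nn.bias_add(
                tf.matmul(h, self.out_w, transpose_b=True), self.out_b)
            loss = tf.nn.sparse_softmax_cross_entropy_with_logits(
                labels=tf.squeeze(labels, -1), logits=logits)
        return tf.reduce_mean(loss)
```

Because the weights live on the model rather than inside a loss closure, a custom training step (or `add_loss`) can call `sampled_loss` directly while Keras still handles gradient tracking and checkpointing.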



from Sampled Softmax in Keras Model
