I'm working with TensorFlow, using code that was implemented for TensorFlow 1 (https://github.com/openai/maddpg/blob/master/maddpg/common/tf_util.py). While migrating this code to TensorFlow 2, I've been able to make most of the changes using the resources available online. However, I can't find a suitable replacement for line 145:
gradients = optimizer.compute_gradients(objective, var_list=var_list)
which throws an error:
AttributeError: 'Adam' object has no attribute 'compute_gradient'
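For context, that call sits inside a gradient-clipping helper in the linked file. Here is a minimal, self-contained sketch of the TF1-style pattern I'm migrating, written with tf.compat.v1 so it runs under TF2; the shapes and names are placeholders of my own, not the actual MADDPG code:

import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # TF1-style graph mode under TF2

# Toy objective over a single variable; shapes/names are illustrative only.
x = tf.compat.v1.placeholder(tf.float32, shape=(None, 9))
w = tf.compat.v1.get_variable("w", shape=(9, 64))
objective = tf.reduce_mean(tf.square(tf.matmul(x, w)))

# TF1 optimizer API: get (gradient, variable) pairs, optionally clip them,
# then apply them with the same optimizer.
optimizer = tf.compat.v1.train.AdamOptimizer(1e-3)
gradients = optimizer.compute_gradients(objective, var_list=[w])
clipped = [(tf.clip_by_norm(g, 10.0), v) for g, v in gradients if g is not None]
train_op = optimizer.apply_gradients(clipped)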
Since this function no longer exists, what are the possible alternatives I can use? I have read that it is possible to use the following function instead:
gradients = optimizer.get_gradients(objective, var_list)
This throws a ValueError:
ValueError: Variable <tf.Variable 'agent_0/q_func/fully_connected/weights:0' shape=(9, 64)
dtype=float32> has `None` for gradient. Please make sure that all of your ops have a
gradient defined (i.e. are differentiable). Common ops without gradient: K.argmax,
K.round, K.eval.
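For what it's worth, the only TF2-native pattern I've seen suggested is tf.GradientTape, where the objective is computed inside the tape and the gradients are applied with optimizer.apply_gradients. Below is a minimal sketch of that pattern, again with placeholder shapes and names rather than the actual MADDPG graph-building code; I'm not sure how to fit it into the existing symbolic-graph style of tf_util.py:

import tensorflow as tf

# Eager/TF2 pattern: record the forward computation on a tape,
# then ask the tape for gradients and apply them with Adam.
w = tf.Variable(tf.random.normal((9, 64)), name="w")
optimizer = tf.keras.optimizers.Adam(1e-3)

x = tf.random.normal((32, 9))
with tf.GradientTape() as tape:
    objective = tf.reduce_mean(tf.square(tf.matmul(x, w)))

gradients = tape.gradient(objective, [w])
optimizer.apply_gradients(zip(gradients, [w]))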
Versions: tensorflow 2.4.1, tensorflow-estimator 2.4.0