Sunday, 16 May 2021

What is tape-based autograd in PyTorch?

I understand that autograd refers to automatic differentiation. But what exactly is tape-based autograd in PyTorch, and why are there so many discussions that either affirm or deny it?

For example:

this statement:

In pytorch, there is no traditional sense of tape

and this one:

We don’t really build gradient tapes per se. But graphs.

but, on the other hand, this:

Autograd is now a core torch package for automatic differentiation. It uses a tape-based system for automatic differentiation.

And for further reference, please compare it with GradientTape in TensorFlow.
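
For concreteness, here is a minimal sketch of what I mean by the two APIs (assuming current torch and tensorflow installs; the printed values are just for this toy case), computing dy/dx for y = x ** 2 in each:

import torch

x = torch.tensor(3.0, requires_grad=True)
y = x ** 2          # PyTorch starts recording automatically because requires_grad=True
y.backward()        # walk the recorded operations in reverse to get gradients
print(x.grad)       # tensor(6.)

import tensorflow as tf

xv = tf.Variable(3.0)
with tf.GradientTape() as tape:   # TensorFlow records only inside this explicit "tape" context
    yv = xv ** 2
print(tape.gradient(yv, xv))      # tf.Tensor(6.0, shape=(), dtype=float32)

In PyTorch the recording happens implicitly on every execution, while TensorFlow asks you to open the tape yourself; that difference seems related to what the quoted statements disagree about.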



from What is tape-based autograd in Pytorch?
