autodiff

How to wrap a numpy function to make it work with jax.numpy?

How to wrap a numpy function to make it work with jax.numpy? Question: I have some JAX code that uses automatic differentiation, and in part of the code I would like to call a function from a library written in NumPy. When I try this, I get: The numpy.ndarray conversion method __array__() was …

Total answers: 1
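A common way to handle this situation is `jax.pure_callback`, which runs a NumPy function outside JAX's tracer, combined with `jax.custom_jvp` to supply a derivative by hand so `jax.grad` still works. A minimal sketch, using `np.sin` as a stand-in for the NumPy-only library routine (the real function and its derivative would differ):

```python
import numpy as np
import jax
import jax.numpy as jnp

# Stand-in for a NumPy-only library function that JAX cannot trace.
def numpy_fn(x):
    return np.sin(x)

@jax.custom_jvp
def wrapped(x):
    # pure_callback executes numpy_fn outside the trace; we must
    # describe the output's shape and dtype up front.
    return jax.pure_callback(
        numpy_fn, jax.ShapeDtypeStruct(x.shape, x.dtype), x
    )

# JAX cannot differentiate through the callback, so the JVP
# (here, the known derivative of sin) is provided manually.
@wrapped.defjvp
def wrapped_jvp(primals, tangents):
    (x,), (dx,) = primals, tangents
    return wrapped(x), jnp.cos(x) * dx

g = jax.grad(wrapped)(jnp.float32(0.0))  # derivative of sin at 0 is 1
```

If the wrapped function's derivative is not known in closed form, `jax.custom_vjp` with a numerically estimated gradient is an alternative, at the cost of accuracy.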

pytorch sets grad attribute to none if I use simple minus instead of -=

pytorch sets grad attribute to none if I use simple minus instead of -= Question: This is a simple code to show the problem:

```python
import torch

X = torch.arange(-3, 3, step=0.1)
Y = X * 3
Y += 0.1 * torch.randn(Y.shape)

def my_train_model(iter):
    w = torch.tensor(-15.0, requires_grad=True)
    lr = 0.1
    for epoch in range(iter):
        print(w.grad)
        …
```

Total answers: 1
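The behavior in the question follows from how autograd tracks leaf tensors: `w = w - lr * w.grad` rebinds `w` to a new, non-leaf tensor whose `.grad` stays `None`, while an in-place `w -= ...` under `torch.no_grad()` keeps the original leaf. A minimal sketch of the difference (the scalar loss here is illustrative, not from the question):

```python
import torch

w = torch.tensor(-15.0, requires_grad=True)
loss = (w * 2).sum()   # toy loss so w.grad gets populated
loss.backward()

# Out-of-place update: w2 is a NEW tensor produced by an autograd op,
# so it is not a leaf and will not accumulate gradients in .grad.
w2 = w - 0.1 * w.grad

# Correct pattern: mutate the original leaf in place, outside the graph.
with torch.no_grad():
    w -= 0.1 * w.grad
w.grad.zero_()
```

This is also why optimizers like `torch.optim.SGD` perform their parameter updates inside a no-grad context.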