
In PyTorch:

  import torch

  # leaf tensors that we want gradients for
  a = torch.tensor([2., 3.], requires_grad=True)
  b = torch.tensor([6., 4.], requires_grad=True)
  Q = 3*a**3 - b**2
  # seed for the vector-Jacobian product (dQ/dQ)
  external_grad = torch.tensor([1., 1.])
  Q.backward(gradient=external_grad)
  print(a.grad, b.grad)  # dQ/da = 9*a**2 = [36., 81.], dQ/db = -2*b = [-12., -8.]
All this can also run on the GPU if you put the tensors on a CUDA device; as written, the snippet above runs on the CPU. Automatic differentiation is the workhorse of modern neural networks.
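
For a GPU run you would create (or move) the tensors on a CUDA device first and the rest stays the same. A rough sketch, assuming a CUDA-capable GPU is available:

  device = torch.device('cuda' if torch.cuda.is_available() else 'cpu')
  a = torch.tensor([2., 3.], requires_grad=True, device=device)
  b = torch.tensor([6., 4.], requires_grad=True, device=device)
  # Q, backward(), a.grad and b.grad then all live on the same device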


Sure, that's how you use it, but it doesn't explain how it works, unlike the article. :)
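

For anyone curious about the "how": the core idea behind reverse-mode autodiff is that every operation records its inputs and its local derivatives, and backward() walks the resulting graph in reverse, accumulating gradients with the chain rule. Here is a toy scalar sketch of that idea (the Value class below is made up for illustration; it is not PyTorch's actual implementation):

  # Toy scalar reverse-mode autodiff, to illustrate the principle only.
  # Each op records its parents and the local derivative w.r.t. each parent;
  # backward() then applies the chain rule over the graph in reverse order.

  class Value:
      def __init__(self, data, parents=(), local_grads=()):
          self.data = data
          self.grad = 0.0
          self._parents = parents
          self._local_grads = local_grads

      def __add__(self, other):
          other = other if isinstance(other, Value) else Value(other)
          return Value(self.data + other.data, (self, other), (1.0, 1.0))

      def __sub__(self, other):
          other = other if isinstance(other, Value) else Value(other)
          return Value(self.data - other.data, (self, other), (1.0, -1.0))

      def __mul__(self, other):
          other = other if isinstance(other, Value) else Value(other)
          return Value(self.data * other.data, (self, other), (other.data, self.data))

      __rmul__ = __mul__

      def __pow__(self, n):
          return Value(self.data ** n, (self,), (n * self.data ** (n - 1),))

      def backward(self):
          # topological order, then chain rule from the output back to the leaves
          order, seen = [], set()
          def visit(v):
              if v not in seen:
                  seen.add(v)
                  for p in v._parents:
                      visit(p)
                  order.append(v)
          visit(self)
          self.grad = 1.0
          for v in reversed(order):
              for parent, local in zip(v._parents, v._local_grads):
                  parent.grad += local * v.grad

  # scalar version of the example above
  a = Value(2.0)
  b = Value(6.0)
  Q = 3 * a ** 3 - b ** 2
  Q.backward()
  print(a.grad, b.grad)  # 9*a**2 = 36.0, -2*b = -12.0

PyTorch's autograd does the same bookkeeping over tensors, with the graph construction and backward pass implemented in C++ and each operator providing its own derivative rule.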



