
Another interesting point is that from a numerical optimization point of view, the Jacobian transpose method is analogous to gradient descent, while using its (pseudo) inverse is basically the Gauss-Newton algorithm.
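A minimal sketch of that correspondence, using a hypothetical 2-link planar arm with unit link lengths (all names, step sizes, and targets below are illustrative, not from the article). The Jacobian-transpose update `dtheta = alpha * J.T @ e` is exactly a gradient-descent step on the cost `0.5 * ||target - fk(theta)||^2` (whose gradient is `-J.T @ e`), while the pseudo-inverse update `dtheta = pinv(J) @ e` solves the linearized system `J @ dtheta = e`, i.e. a Gauss-Newton step:

```python
import numpy as np

L1, L2 = 1.0, 1.0  # link lengths (assumed example values)

def fk(theta):
    """Forward kinematics of a planar 2-link arm: joint angles -> end-effector (x, y)."""
    t1, t12 = theta[0], theta[0] + theta[1]
    return np.array([L1 * np.cos(t1) + L2 * np.cos(t12),
                     L1 * np.sin(t1) + L2 * np.sin(t12)])

def jacobian(theta):
    """Analytic Jacobian d(fk)/d(theta), a 2x2 matrix here."""
    t1, t12 = theta[0], theta[0] + theta[1]
    return np.array([[-L1 * np.sin(t1) - L2 * np.sin(t12), -L2 * np.sin(t12)],
                     [ L1 * np.cos(t1) + L2 * np.cos(t12),  L2 * np.cos(t12)]])

def solve_ik(target, theta0, step, iters=200):
    """Iterate theta += step(J, e), where e = target - fk(theta) is the task-space error."""
    theta = theta0.copy()
    for _ in range(iters):
        e = target - fk(theta)
        theta = theta + step(jacobian(theta), e)
    return theta

# Jacobian transpose == gradient descent on 0.5*||target - fk(theta)||^2:
# the gradient of that cost is -J.T @ e, so adding alpha * J.T @ e descends it.
transpose_step = lambda J, e: 0.1 * (J.T @ e)

# Pseudo-inverse == Gauss-Newton: solve the linearized system J @ dtheta = e.
pinv_step = lambda J, e: np.linalg.pinv(J) @ e

target = np.array([1.2, 0.8])
theta0 = np.array([0.3, 0.3])
for name, step in [("transpose", transpose_step), ("pseudo-inverse", pinv_step)]:
    err = np.linalg.norm(target - fk(solve_ik(target, theta0, step)))
    print(f"{name}: residual {err:.2e}")
```

As you'd expect from the analogy, the Gauss-Newton (pseudo-inverse) iteration converges in far fewer steps near a non-singular configuration, while the transpose/gradient-descent iteration creeps in at a rate set by the conditioning of J Jᵀ.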


Very interesting. I like the connections to gradient descent and the Gauss-Newton algorithm.



