
Yes, exactly! This is huge. Hessian optimization is really easy with JAX; I haven't tried it in Julia, though.
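For anyone curious, here's roughly what "easy" means: a full Hessian is a one-liner via `jax.hessian`. Toy quadratic objective of my own, just for illustration:

```python
import jax
import jax.numpy as jnp

def loss(w):
    # toy objective: 0.5 * ||w||^2 + sum(w), whose Hessian is the identity
    return 0.5 * jnp.sum(w ** 2) + jnp.sum(w)

w = jnp.ones(3)
H = jax.hessian(loss)(w)  # dense 3x3 Hessian via composed autodiff
```

Under the hood this is just forward-over-reverse autodiff composed for you, so it works on any function you can `jax.grad`.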


Here's Hessian-Free Newton-Krylov on neural ODEs with Julia: https://diffeqflux.sciml.ai/dev/examples/second_order_adjoin... . It's just standard tutorial stuff at this point.
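For readers unfamiliar with the term: "Hessian-free" means the Krylov solver (e.g. CG) only ever needs Hessian-vector products, so the full Hessian is never materialized. A minimal sketch of that primitive in JAX (toy function of my own, not taken from the linked tutorial):

```python
import jax
import jax.numpy as jnp

def loss(w):
    return jnp.sum(jnp.sin(w) ** 2)

def hvp(f, w, v):
    # forward-over-reverse: differentiate grad(f) along direction v,
    # giving H @ v without ever building H
    return jax.jvp(jax.grad(f), (w,), (v,))[1]

w = jnp.ones(4)
v = jnp.arange(4.0)
Hv = hvp(loss, w, v)  # same result as jax.hessian(loss)(w) @ v
```

Each `hvp` call costs about as much as a couple of gradient evaluations, which is what makes Newton-Krylov practical at neural-network scale.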


And very fast, given that you compile the procedure! I'm considering writing an article on this and posting it here, because I've seen enormous improvements over non-jitted code, and that's before even bringing in jax.vmap.
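For context, the pattern I mean looks like this (function and sizes made up for illustration): wrap a gradient in `jax.jit` so XLA compiles it once, then layer `jax.vmap` on top for batching.

```python
import jax
import jax.numpy as jnp

def f(x):
    return jnp.sum(jnp.tanh(x) ** 2)

grad_f = jax.grad(f)         # reverse-mode gradient
fast_grad = jax.jit(grad_f)  # first call traces + compiles via XLA

x = jnp.arange(1000.0)
g = fast_grad(x)             # later calls reuse the compiled kernel

# vmap composes on top: per-example gradients over a batch, still one kernel
batched_grad = jax.jit(jax.vmap(grad_f))
xs = jnp.stack([x, 2.0 * x])
gs = batched_grad(xs)
```

The compilation cost is paid on the first call only, which is why benchmarks should always time a warm call.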


There's a comparison of JAX with PyTorch for Hessian calculation here:

https://www.assemblyai.com/blog/why-you-should-or-shouldnt-b...

Would definitely be interested in an article like that if you decide to write it.



