
This model was trained on 6T tokens and has a 256k-entry embedding vocabulary, quite different from a GPT-2 model of comparable size.
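
For scale, a quick back-of-envelope calculation of the embedding parameter counts (Python; the hidden dimension is an assumption for illustration, not either model's actual config):

    # Embedding parameters scale as vocab_size * hidden_dim.
    # hidden_dim=768 (GPT-2 small's width) is assumed here purely
    # for illustration; the 256k-vocab model's width is not given.
    def embedding_params(vocab_size, hidden_dim):
        return vocab_size * hidden_dim

    gpt2 = embedding_params(50_257, 768)        # ~38.6M params
    big_vocab = embedding_params(256_000, 768)  # ~196.6M params

    print(f"GPT-2 vocab: {gpt2:,}")
    print(f"256k vocab:  {big_vocab:,}")
    # Roughly 5x the embedding parameters at the same width, so a
    # size-matched model has far fewer parameters left for the
    # transformer layers themselves.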

