Hacker News

Makes sense. Each model has a maximum context length, so they could charge per token assuming the full context window for that model if they wanted to price for the worst case.
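A minimal sketch of that worst-case pricing idea: bill every request as if it consumed the model's entire context window. All model names, context sizes, and prices below are hypothetical, chosen only for illustration.

```python
# Hypothetical worst-case, per-model pricing. None of these numbers are real.
MAX_CONTEXT = {             # assumed max context length (tokens) per model
    "model-small": 8_192,
    "model-large": 128_000,
}
PRICE_PER_TOKEN = 0.000002  # assumed flat price in dollars per token

def worst_case_cost(model: str) -> float:
    """Charge as if the request filled the model's full context window."""
    return MAX_CONTEXT[model] * PRICE_PER_TOKEN

print(worst_case_cost("model-large"))  # 0.256
```

The real per-request cost would usually be lower, since most prompts do not fill the full window; this just bounds the charge from above.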


