Hacker News

Why would you incur costs with a local model?


Yeah, that's what I'm saying - it would eliminate inference costs. What I was asking is how feasible it is to package these local LLMs with another standalone app. For ex. a game.


Oh sorry. Hm..I actually have no idea. It sounds like a neat idea though. :)
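For what it's worth, one common way to do this (a sketch, not from the thread) is to ship a quantized model file (e.g. a GGUF) as a data file inside the packaged app and load it with a local inference library such as llama-cpp-python. The tricky part is locating the bundled file at runtime: PyInstaller one-file builds unpack data files into a temp directory exposed as `sys._MEIPASS`. The `bundled_model_path` helper below is a hypothetical name, assuming a Python app packaged with PyInstaller:

```python
import os
import sys

def bundled_model_path(filename="model.gguf"):
    """Locate a model file shipped with the packaged app.

    PyInstaller one-file builds unpack bundled data files into a
    temporary directory exposed as sys._MEIPASS; in a normal
    (unpackaged) run, fall back to the launch directory instead.
    """
    base = getattr(sys, "_MEIPASS",
                   os.path.dirname(os.path.abspath(sys.argv[0])))
    return os.path.join(base, filename)

# The resolved path can then be handed to a local inference library,
# e.g. llama-cpp-python: Llama(model_path=bundled_model_path())
```

The main practical caveats are download/install size (even small quantized models run into gigabytes) and CPU/GPU requirements on the player's machine, not the packaging itself.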



