
If I understand the distinction correctly, I run llamafile as a backend: I start it with the filename of a model on the command line (it might need a -m flag or something), and it opens a chat prompt for interaction in the terminal, but it also opens a port speaking some protocol that I can connect to from a frontend (in my case usually gptel in Emacs).
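For what it's worth, here is a minimal sketch of talking to that port from a script. It assumes llamafile's default listen address of http://localhost:8080 and an OpenAI-compatible /v1/chat/completions endpoint; both of those are my assumptions, so check your server's startup output.

    # Minimal sketch: query a locally running llamafile server.
    # Assumes the default address http://localhost:8080 and an
    # OpenAI-compatible /v1/chat/completions endpoint.
    import json
    import urllib.request

    payload = {
        # Placeholder model name; the server uses whatever model it was started with.
        "model": "local",
        "messages": [{"role": "user", "content": "Hello from a script"}],
    }
    req = urllib.request.Request(
        "http://localhost:8080/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        reply = json.loads(resp.read())
        print(reply["choices"][0]["message"]["content"])

A frontend like gptel is doing essentially the same thing, just wired into the editor instead of a one-off script.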


