Hacker News

Can’t you just run ollama and point at its localhost endpoint? I don’t think it’s within scope to reproduce the whole local LLM stack when anyone wanting to do this today can easily use existing, better tools to solve that part of it.
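To illustrate the suggestion: Ollama serves a local HTTP API on port 11434 by default, so any client can talk to it at a localhost endpoint. A minimal sketch using only the Python standard library (the model name `llama3` is illustrative; use whatever model you have pulled):

```python
import json
import urllib.request

# Ollama's default local endpoint; no auth needed for a local server.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt: str, model: str = "llama3") -> urllib.request.Request:
    """Build a POST request for Ollama's generate API (non-streaming)."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    return urllib.request.Request(
        OLLAMA_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )

if __name__ == "__main__":
    # Requires a running `ollama serve` with the model available.
    req = build_request("Say hello in one word.")
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])
```

Any tool that expects an LLM backend at a URL can be pointed at this endpoint instead of a hosted service.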



