Hacker News
tonygiorgio on Feb 2, 2025 | on:
Goose: An open-source, extensible AI agent that go...
Can't you just run Ollama and provide it a localhost endpoint? I don't think it's within scope to reproduce the whole local LLM stack when anyone who wants to do this today can easily use existing, better tools to solve that part of it.
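The suggestion above can be sketched concretely. Ollama serves an OpenAI-compatible API on localhost port 11434 by default, so any agent that accepts a custom base URL can be pointed at it. The model name "llama3.2" below is just an example, not something prescribed in the thread:

```python
import json
import urllib.request

# Sketch, assuming Ollama's default OpenAI-compatible endpoint
# at http://localhost:11434/v1 (run `ollama serve` first).
def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a chat-completion request for a local Ollama server."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        "http://localhost:11434/v1/chat/completions",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

# Actually sending it requires a running Ollama instance:
# with urllib.request.urlopen(build_request("llama3.2", "hello")) as r:
#     print(json.load(r)["choices"][0]["message"]["content"])
```

An agent like Goose would then only need the base URL `http://localhost:11434/v1` in its provider configuration; the local-inference part is handled entirely by Ollama.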