yujonglee's comments | Hacker News

Super impressed and inspired!


kind of a self-plug, but you might find https://github.com/fastrepl/hyprnote/blob/main/README.md interesting.

EDIT: typo


It can run Whisper and Moonshine models locally, while also allowing the use of other API providers. Read the docs - or at least this post.
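For a rough idea of the local flow, here is a sketch; the model identifier is illustrative (`owhisper pull --help` lists the real ones), and `run` as a subcommand is an assumption, not confirmed from the docs:

```sh
# model name below is illustrative - check `owhisper pull --help` for real identifiers
owhisper pull whisper-cpp-base-q8-en   # fetch a Whisper model for local use
owhisper run whisper-cpp-base-q8-en    # transcribe offline (subcommand assumed)
```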


I would want such information accessible without having to go hunt for it. You could improve your presentation by interposing fewer clicks between a reader and the thing they want to know.


The information is readily available in the open-your-eyes section.


> I would want such information accessible without having to go hunt for it.

Where exactly, if not in the FM?


what do you mean? this use case is not an LLM. it is realtime STT.

also fyi - https://docs.hyprnote.com/owhisper/configuration/providers/o...
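If it helps, a hypothetical sketch of what a provider entry might look like; the actual keys and file layout are in the linked docs, and every field name below is a guess, not the documented schema:

```yaml
# illustrative only - field names are assumptions, see the linked providers docs
providers:
  deepgram:
    api_key: ${DEEPGRAM_API_KEY}   # hypothetical env-var interpolation
models:
  - id: nova-2                     # hypothetical cloud model id
    provider: deepgram
  - id: whisper-cpp-base-q8-en     # hypothetical local model id
    provider: local
```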


sure. `owhisper pull --help`


yes. Metal is on


Thank you!


It's a lot more than that.

- It supports other models like Moonshine.

- It also works as a proxy for cloud model providers.

- It can expose local models behind a Deepgram-compatible API server (see the sketch below).
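Since the server speaks a Deepgram-compatible protocol, existing Deepgram clients should be able to point at it by swapping the base URL. A minimal sketch, assuming the local server exposes Deepgram-style streaming at ws://localhost:8080/v1/listen (the port, path, and query parameters are assumptions; check the docs for the real values):

```python
# Minimal sketch: stream raw 16 kHz 16-bit mono PCM to a local
# Deepgram-compatible server. Port, path, and query params are assumptions.
import asyncio

import websockets  # pip install websockets


async def transcribe(pcm_path: str) -> None:
    url = "ws://localhost:8080/v1/listen?encoding=linear16&sample_rate=16000"
    async with websockets.connect(url) as ws:

        async def send_audio() -> None:
            with open(pcm_path, "rb") as f:
                while chunk := f.read(3200):  # ~100 ms of audio per frame
                    await ws.send(chunk)
                    await asyncio.sleep(0.1)  # pace roughly like realtime capture
            await ws.send(b"")  # empty frame ends the stream (Deepgram convention)

        async def recv_results() -> None:
            async for msg in ws:  # JSON messages with interim/final transcripts
                print(msg)

        await asyncio.gather(send_audio(), recv_results())


asyncio.run(transcribe("audio.raw"))
```

The upshot is that anything already written against Deepgram's streaming API could be redirected to the local server just by changing the URL it connects to.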


Thank you. Having it operate as a proxy server that other apps can connect to is really useful.


probably end of this month or early next month. not 100% sure.


yeah we use whisper.cpp for Whisper inference. this is more like a community-focused project, not a commercial product!


Ya, after spending a decent amount of time in r/LocalLLaMA I was surprised that a project would want to name itself in association with Ollama; it's got a pretty bad reputation in the community at this point.

