Hacker News

I've used it locally too. It is great for some kinds of queries, or for writing bash, which I refuse to learn properly.

I really don't want my queries to leave my computer, ever.

It is quite surreal how this 'open weights' model gets so little hype.



It helps to be able to run the model locally, and currently that is slow or expensive. The challenges of running a local model beyond, say, 32B are real.


Yeah, the compressed version is not nearly as good.

I would be fine, though, with something like 10 times the wait time. But I guess consumer hardware needs some serious 'RAM pipeline' upgrades before big models can run even at crawl speeds.
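The RAM-bandwidth point can be sketched with back-of-envelope arithmetic: decoding is roughly memory-bandwidth-bound, since generating each token streams all of the model's weights out of RAM. A minimal sketch, where the parameter counts, quantization width, and bandwidth figures are illustrative assumptions, not measurements:

```python
# Rough decode-speed estimate: tokens/s ~= memory bandwidth / model size,
# because each generated token reads every weight once.
# All numbers are illustrative assumptions, not benchmarks.

def tokens_per_second(params_billions: float, bytes_per_param: float,
                      bandwidth_gb_s: float) -> float:
    """Upper-bound tokens/s for a bandwidth-bound decoder."""
    model_size_gb = params_billions * bytes_per_param
    return bandwidth_gb_s / model_size_gb

# Hypothetical 70B model, 4-bit quantized (~0.5 bytes/param),
# on dual-channel DDR5 at ~80 GB/s:
print(round(tokens_per_second(70, 0.5, 80), 1))   # ~2.3 tokens/s

# Same model unquantized at fp16 (2 bytes/param) drops well below 1 token/s:
print(round(tokens_per_second(70, 2.0, 80), 1))
```

Under these assumptions, a 10x wait-time tolerance does put large quantized models within reach of consumer RAM, which is presumably the 'RAM pipeline' upgrade being wished for.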




