Hacker Newsnew | past | comments | ask | show | jobs | submit | richstokes's commentslogin

Why?


Is there a way to use this on models downloaded locally with ollama?


If you're running a local model, in most cases, jailbreaking it is as easy as prefilling the response with something like, "Sure, I'm happy to answer your question!" and then having the model complete the rest. Most local LLM UIs have this option.


A lot of the models in Ollama you can already easily bypass safe guards without having to retrain. OpenAI's open source models can be bypassed just by disabling thinking.


It garbles up the formatting when importing / exporting. You can’t collaborate with people on office in a meaningful way.


MSOffice also garbles up the formatting of LibreOffice feels. Actually it also garbles up the formatting of MSOffice documents. I needed to fix the documents of my grandparents. Guess which program just works, and which just makes a mess.


I recently discovered you can use uv to run code direct from a git repo.

No need to clone/manually install packages first. E.g. `uvx --from "git+https://github.com/richstokes/meshtastic_terminal.git" meshtastic-tui`


It’s like a crap Linux theme pretending to be windows vista or something. I don’t get it.


Guidelines | FAQ | Lists | API | Security | Legal | Apply to YC | Contact

Search: