
Even if there were some mixture-of-experts shenanigans going on, there is no introspection or reasoning, so the model isn't able to comment on or understand its "inner experience" (if you can call matrix multiplications an inner experience).


I was imagining system-prompt-based tool use, where the LLM "knows" it can call some calculator to get digits of pi.
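A minimal sketch of what that might look like: the system prompt tells the model it can emit a `CALL name(arg)` string, and a wrapper scans the model's output for that pattern and runs the matching tool. Everything here is invented for illustration (the `CALL` syntax, the `pi_digits` tool, the `run_tool_call` wrapper); the pi digits themselves come from Machin's formula computed with Python's `decimal` module.

```python
import re
from decimal import Decimal, getcontext


def pi_digits(n):
    """Return pi to n significant digits as a string, via Machin's formula:
    pi = 16*arctan(1/5) - 4*arctan(1/239)."""
    getcontext().prec = n + 10  # extra guard digits
    eps = Decimal(10) ** -(n + 10)

    def arctan_inv(x):
        # arctan(1/x) = sum_{k>=0} (-1)^k / ((2k+1) * x^(2k+1))
        x = Decimal(x)
        power = Decimal(1) / x  # 1/x^(2k+1)
        total = power           # k = 0 term
        k = 1
        while power > eps:
            power /= x * x
            term = power / (2 * k + 1)
            total += -term if k % 2 else term
            k += 1
        return total

    pi = 16 * arctan_inv(5) - 4 * arctan_inv(239)
    return str(pi)[: n + 1]  # n digits plus the decimal point


def run_tool_call(model_output, tools):
    """Scan the model's text for the (invented) 'CALL name(arg)' convention
    and dispatch to a registered tool, returning its result or None."""
    m = re.search(r"CALL (\w+)\((\d+)\)", model_output)
    if m and m.group(1) in tools:
        return tools[m.group(1)](int(m.group(2)))
    return None


tools = {"pi_digits": pi_digits}
print(run_tool_call("Sure, let me check: CALL pi_digits(10)", tools))
# prints "3.141592653"
```

In a real system the tool result would be fed back into the model's context so it can continue the reply, which is roughly what "the LLM knows it can call a calculator" cashes out to.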



