> The moment somebody at OpenAI reads the QuietSTaR paper (GPT-2 scale network with GPT-3 quality!) and understands what it means, the timer to AGI begins.
I think parametric memory and accelerated grokking are equally promising. The convergence of all of these will dramatically improve reasoning. The next few years will be interesting indeed.