> Humans don't require input to, say, decide to go for a walk.
Impossible to falsify, since humans are continuously receiving inputs from both external and internal sensors.
> What's missing in the LLM is volition.
What's missing is embodiment, or at least a continuous loop feeding it a wide variety of inputs about the state of the world. Given that, plus information about a set of tools by which it can act in the world, I have no doubt that current LLMs would exhibit some kind of volitional-seeming action (possibly not desirable or coherent from a human POV, at least without a whole lot of prompt engineering).
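Concretely, the loop I have in mind looks something like this. It's a toy sketch, not any real vendor's API: `llm_complete`, `read_sensors`, and `run_tool` are all hypothetical stand-ins, and the model call is stubbed so the loop runs as written:

```python
import time

def read_sensors() -> str:
    # Stand-in for cameras, microphones, clocks, proprioception, etc.
    return "time=14:02, battery=71%, ambient_sound=low"

def run_tool(name: str, arg: str) -> str:
    # Stand-in for actuators the model could invoke ("move", "speak", ...).
    return f"tool {name} ran with arg {arg}"

def llm_complete(messages: list) -> str:
    # Hypothetical stand-in for any chat-completion call; stubbed so the
    # sketch executes. A real model would decide here whether to act.
    return "TOOL:speak:going for a walk"

messages = [{"role": "system",
             "content": "You get sensor readings every tick. Reply with "
                        "TOOL:<name>:<arg> to act, or PASS to do nothing."}]
for tick in range(3):  # unbounded in the actual thought experiment
    messages.append({"role": "user", "content": read_sensors()})
    reply = llm_complete(messages)
    messages.append({"role": "assistant", "content": reply})
    if reply.startswith("TOOL:"):
        _, name, arg = reply.split(":", 2)
        messages.append({"role": "user", "content": run_tool(name, arg)})
    time.sleep(1)  # next tick of the loop
```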
Your subjective experience is only the tip of the iceberg of your brain's overall activity. The conscious part is merely a tool your brain uses to help it achieve its goals; there's no inherent reason to favor it.
LLMs can absolutely generate output without input, but we never have zero input. We don't exist in a floating void with no light, sound, touch, heat, or feelings from our own bodies.
But again, this doesn't seem to be the same thing as thinking. If I could only reply to you when you send me a message, but could reason through any problem we discuss just like the "able to want a walk" version of me could, would that mean I could no longer think? I think these are different issues.
On that, though, these seem trivially solvable with loops and a bit of memory to write to and read from; see the sketch below. Would that really make the difference for you? A box set up to run continuously like this would be thinking?
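Here is roughly what "a loop and a bit of memory" could look like. Again a toy sketch under my own assumptions: `llm_complete` is a stubbed stand-in for a real model call, and the REMEMBER convention is invented purely for illustration:

```python
import json
import pathlib

MEMORY = pathlib.Path("memory.json")  # hypothetical scratch file

def load_memory() -> dict:
    # Read back whatever the model wrote on earlier ticks.
    return json.loads(MEMORY.read_text()) if MEMORY.exists() else {}

def llm_complete(prompt: str) -> str:
    # Stubbed stand-in for a real model call.
    return 'REMEMBER {"last_thought": "consider a walk"}'

for tick in range(10):  # conceptually an endless loop
    mem = load_memory()
    reply = llm_complete(
        f"Your notes so far: {json.dumps(mem)}. Think, then optionally "
        'reply REMEMBER {"key": "value"} to update your notes.')
    if reply.startswith("REMEMBER "):
        mem.update(json.loads(reply[len("REMEMBER "):]))
        MEMORY.write_text(json.dumps(mem))  # persist across ticks
```

The specifics don't matter; the point is that persistence across ticks is a small engineering addition around the model, not a new kind of model.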
It's as if an LLM is only one part of a brain, not the whole thing.
So of course it doesn't do everything a human does, but it can still perform some aspects of mental processing.
Whether "thinking" means "everything a human brain does" or whether "thinking" means a specific cognitive process that we humans do, is a matter of definition.
I'd argue that defining "thinking" independently of "volition" is a useful definition because it allows us to break down things in parts and understand them