I think your definition of "reasoning" may be "think like a human" - in which case obviously LLMs can't reason because they aren't human.