
The cause of behavior is absolutely relevant. The same apparent effect can often be attained by a variety of causes, after all. Otherwise, there is no difference between simulation and the real McCoy, no difference between truth and deceit. A liar by definition is someone who does not communicate faithfully his true intentions or beliefs and intentionally creates the mere appearance of doing so in order to deceive.

It is not a mystery why computers are not intelligent: they lack semantics.

Take the concept of a triangle (let's call it "triangularity"). This concept has to be distinguished from concrete triangles, because the concept is what we predicate of all triangles, so we cannot identify triangularity with any one of these triangles, nor with all of them together. Why not? Well, if you identified triangularity with a particular triangle, it would follow that there is only one triangle; you cannot predicate a concrete thing of other things. And you cannot identify triangularity with the class of all triangles, because that makes your reasoning circular: which objects belong to the class of triangles? The triangles, of course; but being a triangle is precisely what the class was supposed to explain. Furthermore, you could not possibly survey the entire class of triangles, and yet, through an analysis of the concept of triangularity, you can come to learn all there is to know about triangularity, and thus everything that can be known about what is common and essential to all triangles.

So triangularity cannot be a matter of image, as images are concrete instances of triangles. Might there be an encoding, then, that encodes this triangularity? The answer again is "no". An encoding is not a carrier of meaning in the way a concept is. It is, in fact, devoid of the meaning it communicates; whatever meaning it has is assigned to it by the intelligent observer.

Consider the encoding of this very block of text I've posted. Objectively, these funny little shapes do not contain the concepts they communicate. As physical artifacts, they have no semantic content beyond their identity as physical objects (in a book, the physical "meaning" of the variously shaped blobs of pigment on paper is just that: pigment on paper; in a computer, perhaps the state of pixels or of an array of semiconductor cells). The words, the concepts: these belong to the writer (me) and to the reader (you). If our language conventions are aligned, communication is possible. But it is always the writer and the reader who bring the semantics to a piece of writing. Without them, there is nothing. So concepts cannot be physical, as the physical is always concrete and particular. And in an interesting analogy: it is the human being who reads intelligence into the behavior of LLMs. There is none in the LLM.
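To make that concrete, here is a toy sketch in Python (purely illustrative, nothing more): the very same bytes can be read as a string of characters or as a number, and nothing in the bytes themselves picks out which reading is "the" meaning; the convention applied by the reader does.

    # One physical pattern of bits, two different readings of it.
    data = b"cat"

    as_text = data.decode("ascii")            # 'cat' under the ASCII convention
    as_number = int.from_bytes(data, "big")   # 6513012 under a big-endian integer convention

    print(as_text, as_number)
    # Nothing in the bytes themselves selects one interpretation over the other;
    # the reader's convention does.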

Computing devices are, of course, entirely physical, but computers are, strictly speaking, purely mathematical formalisms that physical machines only simulate. But even if we reify these mathematical constructs or identify them with physical machines, we are left with, at best, syntactic machines. And no amount of syntax will ever amount to any semantics. It is magical thinking to believe, without justification and usually by appeal to ignorance, that somehow the lead of syntax will transmute into the gold of semantics. There is nothing in the nature of syntax that could accomplish this, and that is so by definition.
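As a minimal sketch of what "purely syntactic" means here (again in Python, with made-up rule names): a procedure that rewrites uninterpreted tokens according to formal rules, and does so correctly whether or not anyone attaches a meaning to those tokens.

    # A purely formal rewrite system: rules map shapes to shapes.
    # We may *read* P, Q -> R as an inference; the program only matches shapes.
    RULES = {("P", "Q"): "R"}

    def derive(tokens):
        """Repeatedly replace the first adjacent pair that matches a rule."""
        tokens = list(tokens)
        changed = True
        while changed:
            changed = False
            for i in range(len(tokens) - 1):
                pair = (tokens[i], tokens[i + 1])
                if pair in RULES:
                    tokens[i:i + 2] = [RULES[pair]]
                    changed = True
                    break
        return tokens

    print(derive(["P", "Q"]))  # ['R'], produced by shape-matching alone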

So the tl;dr is: computers lack semantics and intentionality, which means they cannot, even in principle, be intelligent.



Why do you think that the human mind can contain semantics but a machine cannot? This argument needs some sort of dualism, or what Turing called the "argument from continuity in the nervous system", to account for this.

FWIW I don't think that the "triangularity" in my head is the true mathematical concept of "triangularity". When my son learned about triangles, for example, at first the concept was just a particular triangle in his set of toy shapes. Then I pointed at more things and said "triangle", and now his concept of a triangle is larger: it includes multiple things he has seen and sentences people have said about triangles. I don't see any difficulty with semantics being "a matter of image", really.

Why do we believe that semantics can exist in the human mind but cannot exist in the internals of a machine?

Really "semantics"

I had come across this Catholic philosopher: https://edwardfeser.blogspot.com/2019/03/artificial-intellig... who seems to make a similar argument to this one, i.e. that it is humans who give meaning to things: "logical symbols on a piece of paper are just a bunch of meaningless ink marks"


> Why do you think that the human mind can contain semantics but a machine cannot? [...] Why do we believe that semantics can exist in the human mind but cannot exist in the internals of a machine?

Because I know human minds have semantic content (it would be incoherent to deny it, as the denial itself involves concepts), and because I know what a computer is by definition: a purely syntactic formalism. Anyone who knows the history of computer science will know that computation was intentionally defined as a purely syntactic process, and because it is syntactic, we can mechanize it using physical processes. And no amount of syntax ever amounts to semantics, just as no matter how many natural numbers you add, you'll never get a pineapple, or even the number pi. How could it?

Whether this entails dualism or not depends on what you mean by "dualism". It does not entail Cartesian dualism, though a Cartesian dualist can accept this view as presented.

> seems to make a similar argument to this; i.e. that it's the humans who give meaning to things, "logical symbols on a piece of paper are just a bunch of meaningless ink marks"

We don't give meanings to things per se. The ink marks on a piece of paper, taken by themselves, mean just that: ink marks on a piece of paper. Those are still meanings. However, writing involves the instrumentalization of physical things to make conventional signs, and signs are things that stand in for something else. So, yes, we can make ink marks with which we associate certain meanings and agree on a convention so that we can communicate.

> FWIW I don't think that the "triangularity" in my head is the true mathematical concept of "triangularity".

What is the "true mathematical concept"?

Concepts can be vague (though triangularity per se is so crisp and simple that I reject the idea that you don't have a clear idea of "triangularity" as such), and we usually do not explicitly grasp all that's entailed by them. For example, people knew what triangles were before they learned that the sum of their angles is always 180 degrees. The latter falls out of an analysis of the concept. And this law applies to all triangles because it necessarily falls out of the concept of triangularity, not because we've empirically shown that all triangles seem to have this property, approximately.
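For illustration, the classical Euclidean derivation (assuming the parallel postulate; the symbols below are simply the three interior angles) shows how that law falls out of the concept rather than out of measurement:

    % Through one vertex, draw the line parallel to the opposite side.
    % Alternate interior angles reproduce \alpha and \beta at that vertex,
    % and together with \gamma they fill a straight line, so
    \alpha + \beta + \gamma = \pi = 180^{\circ}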

> I don't see any difficulty with semantics being "a matter of image", really.

Your son, as he was learning, was abstracting from these individual examples. He realized that you don't mean this triangle, or that triangle, but something both have in common, and ultimately that is triangularity, which is not just a property or feature of a given triangle, like "green" in "green triangle", but the what of a triangle. But if you reduce concepts to images, you end up with problems and paradoxes. For example, why should a collection of these things, to the exclusion of those things, count as the triangles? Or take the number three: you have never encountered the number three itself, only collections of three things. Or the notion of similarity between images. There are well-known issues with an imagist notion of the mind.



