Friday, June 28, 2019

Perception as controlled hallucination - predictive processing and the nature of conscious experience

I've now read several times through a fascinating Edge.org conversation with philosopher Andy Clark. I suggest you read the whole piece; here I pass on some edited clips. First, his comments on most current A.I. efforts:
There's something rather passive about the kinds of artificial intelligence ... [that are] ... trained on an objective function. The AI tries to do a particular thing, for which it might be exposed to an awful lot of data in trying to come up with ways to do this thing. But at the same time, it doesn't seem to inhabit bodies or inhabit worlds; it is solving problems in a disembodied, disworlded space. The nature of intelligence looks very different when we think of it as a rolling process that is embedded in bodies and embedded in worlds. Processes like that give rise to real understandings of a structured world.
Then, his ideas on how our internal and external worlds are a continuum:
Perception itself is a kind of controlled hallucination. You experience a structured world because you expect a structured world, and the sensory information here acts as feedback on your expectations. It allows you to correct and refine them. But the heavy lifting seems to be done by the expectations. Does that mean that perception is a controlled hallucination? I sometimes think it would be good to flip that and just think of hallucination as a kind of uncontrolled perception.
The Bayesian brain, predictive processing, and hierarchical predictive coding are all, roughly speaking, names for the same picture, in which experience is constructed at the shifting borderline between sensory evidence and top-down prediction or expectation. There's a big, fairly solid literature on the perceptual side of things. What predictive processing did that I found particularly interesting (and this is mostly down to a move made by Karl Friston) was to apply the same story to action. In action, we make a certain set of predictions about the shape of the sensory information that would result if we were to perform the action, and then we get rid of prediction errors relative to that predicted flow by making the action.
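To make the perceptual half of that picture concrete before moving on, here is a minimal toy sketch in Python of the core predictive-coding loop. It is my illustration, not Clark's or Friston's actual formulation; the one-dimensional belief, the learning rate, and the function name are all assumed for the example.

```python
# Toy predictive-coding loop: a one-dimensional "brain" holds a belief
# mu about a hidden cause, issues a top-down prediction of the sensory
# signal, and revises mu to cancel the bottom-up prediction error.
# All names and numbers here are illustrative assumptions.

def perceive(sensory_input, mu=0.0, learning_rate=0.1, steps=50):
    """Settle the belief `mu` on whatever best explains the signal."""
    for _ in range(steps):
        prediction = mu                      # top-down expectation
        error = sensory_input - prediction   # prediction error acts as feedback
        mu += learning_rate * error          # the expectation does the heavy lifting
    return mu

print(perceive(sensory_input=3.0))  # converges toward 3.0
```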
There's a pleasing symmetry there. Once you've got action on the table in these stories (we bring action about by predicting sensory flows that are non-actual and then getting rid of prediction errors relative to those predicted flows by bringing the action about), epistemic action, as it's sometimes called, is right there on the table too. Systems like that don't just act in the world to fulfill their goals; they can also act in the world so as to get better information to fulfill their goals. And that's something active animals do all the time. The chicken bobs its head around to move its sensors, gathering information for a kind of depth perception it can't achieve otherwise... Epistemic action, practical action, perception, and understanding are now all rolled together in this nice package.
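Friston's move for action can be sketched the same way, again only as an assumed toy model: hold the prediction fixed and change the world until the sensed signal matches the predicted, non-actual one.

```python
# Toy "active inference" counterpart: instead of revising its belief,
# the agent keeps its predicted sensory flow fixed and acts on a single
# world-state variable until sensation matches prediction.
# Names and parameters are illustrative assumptions, not Friston's model.

def act(world_state, predicted_input, gain=0.2, steps=50):
    """Cancel prediction error by moving the world, not the belief."""
    for _ in range(steps):
        error = predicted_input - world_state  # error against the predicted flow
        world_state += gain * error            # action brings the prediction about
    return world_state

print(act(world_state=0.0, predicted_input=1.0))  # converges toward 1.0
```

In this sketch perception and action differ only in which side of the error gets moved, which is the symmetry Clark points to.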
An upshot here is that there's no experience without the application of some model that sifts what in the signal is worthwhile for a creature like you and what isn't.
Apart from the exteroceptive signals that we take in from vision, sound, and so on, and apart from the proprioceptive signals from the body that we predict in order to move the body around, there are also all the interoceptive signals coming from the heart, the viscera, et cetera... Being subtly inflected by interoceptive information is part of what makes our conscious experience of the world the kind of experience it is. So artificial systems without interoception could perceive their world in an exteroceptive way and act in that world, but they would lack what seems to me one important dimension of what it is to be a conscious human being in the world.

1 comment:

  1. That talk seemed to me a beacon of sanity in a subject prone to flights of fancy. Perception as controlled hallucination is a very insightful metaphor. We are engulfed by naive realism ("green" trees, "hot" fires, etc., and the big one, the illusion of unified consciousness), but a little physics tells us that the contents of consciousness bear much the same relationship to the real world that the words in a book do: it's a (selective) encoding.
