Sunday 11 August 2013

Consciousness is not like Photosynthesis

I've been posting some comments on Massimo Pigliucci's blog recently in defense of Strong AI, after a couple of posts attacking the computational theory of mind.

In the first, he invokes John Searle's Chinese Room argument to argue that a computer cannot be conscious. If you've been paying attention to this blog, you should know what I think of that argument!

In the second post, he uses an article by Jack Copeland which undermines the position of philosophers who take the Church-Turing thesis to imply that a Turing machine can perform any computation performable by any other machine. Unfortunately, Pigliucci seems not to realise that while Copeland is technically correct, the philosophers he criticises are no less so, unless some very unlikely propositions turn out to be true (namely, that the laws of physics are not computable and that the brain is a hypercomputer).

But what motivated me to write this post is the same tired old argument that keeps coming up again and again in these discussions, the analogy to photosynthesis.

Pigliucci is a biological naturalist, which means that he thinks that although the brain is essentially just a machine that obeys the laws of physics like any other ball of matter, there is something as yet unknown, intrinsic to the biology of brains, that gives rise to consciousness. In other words, you probably can't have consciousness without biology, and certainly not until we understand that biology well enough to physically mimic it. In short, computation is not enough.

He likes to explain this position and why he doubts the computational theory of mind by analogy to photosynthesis. This analogy came up repeatedly in a discussion with the wonderful Eliezer Yudkowsky on bloggingheads.tv, despite Yudkowsky's efforts to explain why it doesn't work (efforts which in my view succeed entirely).
Massimo Pigliucci: ... photosynthesis [...] is very well understood at a chemical level [... and at ...] the logical processing level. 
[...]

You can in fact simulate those chemical reactions inside a computer. [...] The fact is of course that you can simulate all you want, the one thing you're not going to get out of the simulation is sugar, which is the outcome of photosynthesis. And the reason for that is because what's important about photosynthesis is not just the logical structure of the process, it's the actual physical particular implementation. Unless you have certain kinds of chemicals that work in a certain way [...] you just don't have photosynthesis.
 - Edited from video discussion transcript

This harks back to arguments by John Searle and others, who joke that we don't need to call the fire department if a simulated fire gets out of control, or that a simulated hurricane isn't likely to flood your apartment.

These arguments are all the same and they are all very poor.

The reason is simple. The brain is an information processing machine, and so the only product the brain needs to make reliably is output. Output is not a physical product; it is information. Like a novel, which can be paperback or hardback, audiobook or ebook: as long as the content is the same, nothing essential has changed.
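To see how little the physical format matters, here is a minimal sketch in Python (an illustration of my own, using an arbitrary message): the same content survives any number of changes of physical representation.

    message = "Call me Ishmael."

    # Three different "formats" for the same content.
    as_utf8_bytes = message.encode("utf-8")               # a byte string
    as_code_points = [ord(c) for c in message]            # a list of integers
    as_bits = "".join(f"{b:08b}" for b in as_utf8_bytes)  # a string of bits

    # Decoding any of the representations recovers exactly the same content.
    assert as_utf8_bytes.decode("utf-8") == message
    assert "".join(chr(c) for c in as_code_points) == message
    assert bytes(int(as_bits[i:i+8], 2)
                 for i in range(0, len(as_bits), 8)).decode("utf-8") == message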

When I pointed this out to Massimo, he responded with astonishment, as though I had claimed that the brain did not produce any physical chemicals at all.

Of course this is not my position. I am well aware that the brain produces many hormones, neurotransmitters and other chemicals that have an impact on the brain and indeed the body at large. However, as long as we keep in mind the self-evident proposition that the job of the brain as an information processing device is to process information, the chemicals are not important as ends in themselves (unless consciousness is a chemical!). If the brain were replaced by a perfectly equivalent electronic device, these chemicals would not be produced, but the information processing would be unaffected. Whatever functions the chemicals had once implemented would have been replaced by an electronic analogue.

In photosynthesis, the product is sugar -- a physical substance. Of course you can't get physical real-world sugar from a simulation of photosynthesis. The best you can hope for is virtual sugar.

So why don't we say that the best we could hope for from a simulated brain is virtual output? It should be clear that that doesn't really make sense. Unlike virtual sugar, virtual output is pretty much the same thing as real output, and can be put to real-world use. Information cannot be virtual because it is not physical in the first place.

But output is not Massimo's concern. He is concerned with consciousness. Perhaps he thinks that the brain's job is to produce consciousness, or perhaps consciousness is just a necessary part of its information processing job. In either case, unless consciousness is a physical substance or a state of physical matter (notions as outmoded as the vitalist assumption that a substance called élan vital is responsible for life), he and his colleagues have no business making analogies to photosynthesis, hurricanes or fire.

According to the computational theory of mind (or strong AI as I have been referring to it in previous posts), consciousness is not a physical substance but an aspect of certain sophisticated ongoing computations. If this is correct, then consciousness achieved in a simulation is every bit as real as that achieved in a physical brain, for the same reason that a simulated calculator is still a calculator.

Quite simply, there is no such thing as a simulated computation. Once you're simulating a computation, you're doing the computation.
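To illustrate with a toy example of my own devising: below is a trivial stack-machine "calculator" simulated in Python. The instruction set is invented, but the arithmetic it performs is not virtual arithmetic; its output is real and usable.

    def run(program):
        # Interpret a list of instructions on a simulated stack machine.
        stack = []
        for op, *args in program:
            if op == "PUSH":
                stack.append(args[0])
            elif op == "ADD":
                b, a = stack.pop(), stack.pop()
                stack.append(a + b)
            elif op == "MUL":
                b, a = stack.pop(), stack.pop()
                stack.append(a * b)
        return stack.pop()

    # Compute (2 + 3) * 7 on the simulated calculator.
    result = run([("PUSH", 2), ("PUSH", 3), ("ADD",),
                  ("PUSH", 7), ("MUL",)])
    assert result == 35  # real, usable output -- not "virtual" output

The number that comes out can be put to any real-world use; in exactly this sense, simulating a computation just is performing it.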

So please, please, please stop making analogies to physical processes when there is every reason to believe that consciousness is not a physical phenomenon.

9 comments:

  1. Hi Disagreeable,

    Thank you for posting that transcript. I found Massimo's contribution so confused that I just felt I was wasting my time reading it. I didn't even manage to establish whether he thinks a computer AI could be behaviourally isomorphic to a human brain. First he seems to deny it. Then he seems to agree with it. But if he agrees, it's hard to see what relevance biology has to the Singularity. If a computer AI can be behaviourally isomorphic to a human brain, then it can do the things a human brain can do, including developing computer systems like AIs.

    1. Hi Richard,

      He seems to be agnostic on whether a computer can be behaviourally like a human brain. I agree that means he doesn't have much to say about the singularity, as he's much more interested in whether such a computer would be conscious - and he says it wouldn't.

      For my part, I don't have much to say about the singularity because I'm agnostic on whether AI will ever be achieved. I think it should be possible in principle but have no idea how feasible it might be.

  2. > The reason is simple. The brain is an information processing machine, and so the only product the brain needs to make reliably is output. Output is not a physical product; it is information. Like a novel, which can be paperback or hardback, audiobook or ebook: as long as the content is the same, nothing essential has changed. <

    Do you have a post where you outline this more carefully? It seems to me that this is always where the contention between my position and yours lies. You just assert that the brain is an information processing machine but never demonstrate why, or show how that gives explanatory power. You make it sound as if information is ontologically basic or a discrete entity; could you convince me that is true?

    1. Hi Louis,

      Really great to hear from you again.

      It might help if I understood the precise nature in which your view differs from mine. Do you think it is nonsensical to regard the brain as an information processing machine, or do you think that this happens not to be true in fact?

      I see the mind/brain/body as analogous to software/computer/robot. In the case of a robot with an onboard computer, it is absolutely clear to me that the computer is an information processing machine, producing not a substance but information in the form of commands to the motors of the body.
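      To make that concrete, here is a minimal sketch in Python (the sensor and motor interfaces are hypothetical stand-ins I've invented for illustration, not any real robot API):

          def read_sensors():
              # Stand-in for physical sensors; what they deliver is plain data.
              return {"obstacle_distance_m": 0.4}

          def control_step(sensor_data):
              # Pure information processing: data in, commands out.
              if sensor_data["obstacle_distance_m"] < 0.5:
                  return (-0.2, 0.2)  # turn away from the obstacle
              return (1.0, 1.0)       # drive straight ahead

          def send_motor_command(left_speed, right_speed):
              # Stand-in for a motor driver; the command itself is just information.
              print(f"motor command: left={left_speed}, right={right_speed}")

          send_motor_command(*control_step(read_sensors()))

      Nothing the onboard computer produces here is a substance; its entire product is the stream of commands.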

      Would you disagree with this description of a robot? Do you think it inappropriately treats information as ontologically basic? If so, could I fix it by just changing my wording? For instance, I'd be happy to say instead "The brain is unlike photosynthesis because its job is not to produce a physical substance but to control the body via signals".

      As to why I think the brain is actually like a computer, that goes to the standard arguments for computationalism, of which there are many on this blog. In condensed form, the argument is:

      1. Everything that happens in the universe depends on physical laws.
      2. All physical laws are mathematical.
      3. All physical laws are computable (this may not be true, but it seems to be approximately true and is a reasonable simplifying assumption; if it is false, I still think the brain is essentially a computer, but the argument is more involved).
      4. That which is computable can in principle be simulated on a computer.
      5. A brain is a physical object.
      6. A brain can in principle be simulated on a computer (including its body and environment if necessary).
      7. The simulated brain must behave identically to a real brain.
      8. The computer running the simulated brain therefore has all the functionality of a real brain.
      9. Therefore it is possible for a computational algorithmic process to exhibit the same information processing as a biological brain.
      10. Therefore, if evolution needs a centre for co-ordination and control, all it needs to evolve is a biological computer.
      11. Therefore, if qualia and consciousness are divorceable from the functional aspects of a computation, nature didn't need to evolve qualia and consciousness.
      12. Therefore qualia and consciousness are probably not divorceable from the functional aspects of a computation, and a brain is in all respects equivalent to a computer running an appropriate algorithm.

      Now, there is another respect in which a brain is not simply an information processing machine, and this is that parts of the brain function as glands that secrete endocrine hormones that have more global effects on the body. I'm excluding these from consideration as I think this is a separate issue.

    2. Hi DM,
      Thanks for your response and sorry for the slow reply. I'm highly constrained in my time and resources, so I apologise that I can't give your post the response it deserves. I feel that the above list is nascent, or at least that's the impression I get from it. That is to say, there are jumps where I have to take you at your word. Or do you address each point in your blog?
      Respect,
      Lou

    3. Also how do you dismiss anti-representational accounts like Enactivism? http://en.wikipedia.org/wiki/Enactivism_(psychology)

    4. Hi Louis,

      That list was nascent in the sense that it was off the top of my head. That said, from my blinkered perspective it seems reasonably complete, and no holes jump out at me. It might help if you pointed out where the holes lie. Possibly they are addressed elsewhere on my blog, and if not, those holes need to be patched!

      Enactivism doesn't make much sense to me, going on the Wikipedia description. Perhaps if I read a book on the subject I'd be able to get my head around it. It seems, however, that the enactivist perspective is perhaps similar in some respects to Searle's claim that "you can't get semantics from syntax!", paralleling the enactivist claim that there is no content (semantics), only vehicles (syntax). The difference is that Searle denies computationalism because it doesn't account for semantics, while Enactivism denies computationalism because it doesn't dispense with semantics!

      Perhaps my response to "semantics from syntax" will explain my position.

      http://disagreeableme.blogspot.co.uk/2012/11/in-defence-of-strong-ai-semantics-from.html

    5. > 11. Therefore, if qualia and consciousness are divorceable from the functional aspects of a computation, nature didn't need to evolve qualia and consciousness.
      12. Therefore qualia and consciousness are probably not divorceable from the functional aspects of a computation, and a brain is in all respects equivalent to a computer running an appropriate algorithm. <

      If I understand correctly, your argument is something like - "If nature could have gotten along just fine without evolving consciousness, it would have. But it didn't, therefore consciousness is probably inseparable from the functional aspects of computation and thus a computer could be conscious."

      There are plenty of outputs of evolution that are not teleological but simply "side effects" or "accidents". Perhaps it's simply easier to make a biological brain that's really smart if you allow consciousness than to make one that lacks consciousness, but both are possible.

      I happen to agree with the conclusion that a simulated brain would exhibit consciousness, but I don't think this is a sound argument. I don't have a better argument, other than to simply stop at #7 on your list above.

    6. Hi BazookaJoe,

      I agree that it's not ironclad, nor is it meant to be. My conclusion from this is only that consciousness is *probably* not divorceable from the functional aspects of a computation.

      I agree that in principle there could be something special about biology or neurons that makes it easier to make a smart conscious brain than a smart unconscious brain, but it seems implausible without further argument to support it.

      I think intuitively, consciousness, if it is something more than the functional aspects viewed from the first person, is something rather special and mysterious, and we should be suspicious of its being simply an accident serving no evolutionary purpose. I think most of the other fitness-neutral evolutionary side effects you allude to (a particular sequence of junk DNA, for instance) are decidedly more prosaic.

      Thanks for stopping by.
