Saturday, May 27, 2017

Reflections on "Toward a Noncomputational Cognitive Neuroscience"

Note on where this came from: I was bored one afternoon and one of my friends posted this paper ("Toward a Noncomputational Cognitive Neuroscience") on Facebook. I have nothing against it; I just wanted to do some writing and analysis, since it had been a while, and this happened to show up. I do feel like my tone ended up being overly harsh, and I guess the paper did annoy me a little, but it also had some interesting ideas, and who knows how off I am, so please do check it out on your own if you're curious.

Since the 'computational' is such an exceedingly far-reaching category, I keep an eye out for things which fundamentally can't be included in it. I only know of one subject totally outside its bounds: immediate conscious experience, all the qualia currently present—the subject matter of phenomenology.

Unfortunately, "Toward a Noncomputational Cognitive Neuroscience," doesn't seem to supply any new instances of the non-computational. Its central argument appears to be a straw man (but mine may in part be too, so make sure to check out [0]): it takes overly restrictive definitions of computation and information processing and then demonstrates how its alternate approach does not fall under said restrictive definitions. Yes, contemporary artificial neural nets don’t adjust their connection weights, transfer functions etc., while operating (i.e. after training), and a dynamical systems analysis of a system which does this is qualitatively different from one which does not—but computation has no intrinsic limits that would prevent someone from architecting such a neural net, even using the paper's narrow definition of computation which requires rules to be operating on representations. 

The paper's view on the difference between simplified and realistic nets is most concisely stated here:

"It is the processing of representations that qualifies simplified nets as computational. In realistic nets, however, it is not the representations that are changed; it is the self-organizing process that changes via chemical modulation. Indeed, it no longer makes sense to talk of 'representations.'"

Why would a computation which changes its own rules no longer be a computation? Such computations are at the very heart of computation theory! Additionally, the sense in which the system is no longer operating on representations can only be superficial, since at some level of interpretation representations are still obviously a component of cognition. The difference is just that they emerge at a higher level, rather than being explicitly defined objects that the base system operates on directly.
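
As a toy illustration of rule-changing computation (again my own sketch, not the paper's), here is a little rewrite system in which one of the rules edits the rule table itself mid-run:

def append_b(tape, rules):
    return tape + "b", rules

def append_a_then_rewrite(tape, rules):
    # This rule changes the rules: from here on, 'a' maps to a different action.
    new_rules = dict(rules)
    new_rules["a"] = append_b
    return tape + "a", new_rules

def run(tape, rules, steps=5):
    for _ in range(steps):
        action = rules[tape[-1]]          # look up the rule for the current symbol
        tape, rules = action(tape, rules)
        print(tape)
    return tape

run("a", {"a": append_a_then_rewrite, "b": append_b})
# prints: aa, aab, aabb, aabbb, aabbbb -- still rules operating on representations,
# even though one of the rules rewrote the rule table along the way.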

The separate argument about digital computers being classical, non-chaotic dynamical systems also falls flat, in my opinion. The sense in which computers are classical systems is superficial: if I can write a program for a computer which, at the appropriate level of interpretation, is nonlinear and chaotic, what does it matter that the substrate is classical? If you insist on only modeling the state space of the substrate, rather than something higher level, then sure, it's always classical; but what do you gain by doing that?

The bit connecting Freud back to the super abstract dynamical systems stuff was a pretty neat idea, I thought. It would be interesting to see whether there's a good fit with any ideas of, e.g., William James or Jung.

On the other hand, I suspect the paper's attempt to incorporate the ideas of Derrida et al. is part of a flawed justification for considering its notion of the noncomputational to be more significant than it really is.

"Out of that intersection of ‘self’ and ‘other’ the dynamic whole evolves in its spontaneous, unexpectedly bifurcating manner. So the brain does not compute; it permits and supports ‘participation’ between self and other in the evolving whole."

I read that, and many other arguments in the paper, as circumlocutions that avoid simply saying, "the system is bottom-up, not top-down."

"The outside is not represented inside but participates on the inside as a constraint on a self-organizing process."

As far as I can tell, this usage of 'to participate' just refers to something which can't be modeled top-down and which is one among other things involved. The paper seems to be saying, largely and indirectly, that cognition is emergent, and, erroneously, claiming that you can't create emergent, chaotic systems in classical computers:

"Computation as understood by the tradition is not performed by chaotic systems. Computer computation is not sensitively dependent on initial conditions."

And yet it is capable of executing programs which are.
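
For whatever it's worth, a short demonstration (my sketch): the logistic map at r = 4 is the textbook example of sensitive dependence on initial conditions, and it runs happily on any digital computer.

def logistic(x, r=4.0, steps=50):
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

a, b = 0.200000000, 0.200000001   # initial conditions differing in the ninth decimal place
print(logistic(a), logistic(b))   # after 50 iterations the two trajectories bear no resemblance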

------------------------------------


[0] To be fair, though, the sense in which it is a straw man is only this: the paper doesn't say anything significant about computation in general, only about a special, restricted Computation which it is interested in (and which I'm sure many other academics are interested in as well). The reason I take issue with it (in addition to the clickbaitiness of 'Noncomputational') is that the paper also comes with the suggestion of a paradigm shift for Cognitive Neuroscience, and negating this special Computation isn't sufficient to constitute a paradigm shift. Anyway, considering that, much of my review may itself be a straw man. Oh well.
